# Deep Learning with LabVIEW. Tutorial #1.4: Multi-Output Non-Linear Regression - Sin(x) & Cos(x)

## Introduction

We have previously discussed several regression types, including logistic and linear regression. In this blog post, our focus shifts to non-linear regression models. Unlike simple linear regression, which models variables linked by a straight-line equation, non-linear regression is a statistical approach for modeling complex relationships that form curved patterns in the data, making it suitable for the many fields where data doesn't follow straight-line trends.

In this article, we'll discuss multi-output non-linear regression and try to predict both the sine and cosine values of a given input simultaneously.

This and other DeepLTK examples are available on our GitHub repository.

## Training: 1_Sin(x)_Cos(x)(Training).vi

### Front Panel of Training VI

The front panel showcases two graphs: one depicting the predictions for **sin(x)** and the other for **cos(x)**, each compared against its ground truth values, with errors calculated as the difference between the predicted and ground truth values. These graphs update when the "**Eval?**" button is pressed. "**Loss Graphs**" displays the history of the training and test loss values across the training process.

The "**# of Sine Cycles**" control determines the input variable range in the dataset. In this instance, it's set to 5, indicating that the model will try to predict sine and cosine values over five full cycles.

The "**Loop Time**" indicator shows the time taken for each iteration of the training process.

### Block Diagram of The Training VI

The block diagram consists of *dataset generation*, *network creation and configuration*, and the *training process* itself. Let's examine each component individually.

#### Dataset Generation

In the case of multi-output regression, we store the Sin(x) and Cos(x) values in the first and second columns of the 2D output array, respectively. Take a look at the block diagram snapshot below.

We create 2000 training samples and 5000 test samples. The number of cycles defines the dataset's scope: the inputs are distributed uniformly within the range of 2π times the number of cycles. Ultimately, the generated samples become the network's inputs, while the sine and cosine of these samples serve as its outputs.
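Since a LabVIEW block diagram can't be reproduced in text, here is a minimal NumPy sketch of the same dataset-generation logic; the function name `make_dataset` and the seeded random generator are our own assumptions, not part of the VI:

```python
import numpy as np

def make_dataset(n_samples, n_cycles, seed=0):
    """Uniform inputs in [0, 2*pi*n_cycles]; the output array holds Sin(x)
    in its first column and Cos(x) in its second, mirroring the VI's layout."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(0.0, 2.0 * np.pi * n_cycles, size=(n_samples, 1))
    y = np.hstack([np.sin(x), np.cos(x)])
    return x, y

x_train, y_train = make_dataset(2000, 5)      # 2000 training samples
x_test, y_test = make_dataset(5000, 5, seed=1)  # 5000 test samples
```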

#### Network Creation and Configuration

Calculating (or predicting) trigonometric functions is a relatively complex task. For that reason the network comprises 5 layers: an input layer, three hidden layers, and an output layer.

The hidden layers employ the *Swish* activation function, while the output layer has no activation function. The reason for not using an activation function on the output layer is that our objective is not to predict probabilities; instead, we are aiming to predict the actual sine and cosine values.
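The same architecture can be sketched in NumPy as a plain forward pass. The hidden-layer widths below are illustrative assumptions (the VI's actual layer sizes are set on the block diagram), but the structure matches the text: Swish on the hidden layers, linear output:

```python
import numpy as np

def swish(z):
    """Swish activation: z * sigmoid(z)."""
    return z / (1.0 + np.exp(-z))

def init_net(sizes, seed=0):
    """sizes like [1, 64, 64, 64, 2]: input, three hidden layers, output."""
    rng = np.random.default_rng(seed)
    return [(rng.normal(0.0, np.sqrt(2.0 / m), (m, n)), np.zeros(n))
            for m, n in zip(sizes[:-1], sizes[1:])]

def forward(params, x):
    """Swish on hidden layers; the output layer is linear (no activation),
    so the network emits raw sin/cos values rather than probabilities."""
    h = x
    for W, b in params[:-1]:
        h = swish(h @ W + b)
    W, b = params[-1]
    return h @ W + b

net = init_net([1, 64, 64, 64, 2])  # hidden widths are our assumption
```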

#### Training Process

During the training phase, we train the network on the training dataset. In the evaluation phase, the test dataset is used to assess the network's accuracy by comparing its predictions against the ground truth values (the outputs stored in the dataset).
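The DeepLTK solver itself is configured on the block diagram, but the underlying idea can be sketched as a tiny NumPy gradient-descent loop: forward pass, mean-squared-error loss, backward pass, weight update. The single hidden layer, learning rate, and step count below are illustrative assumptions for brevity, not the VI's actual settings:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(0.0, 2.0 * np.pi, (512, 1))  # one cycle, for brevity
y = np.hstack([np.sin(x), np.cos(x)])        # ground-truth outputs

# One Swish hidden layer (32 units) and a linear output layer.
W1 = rng.normal(0.0, 0.5, (1, 32)); b1 = np.zeros(32)
W2 = rng.normal(0.0, 0.2, (32, 2)); b2 = np.zeros(2)
lr, losses = 1e-3, []

for step in range(3000):
    z = x @ W1 + b1
    s = 1.0 / (1.0 + np.exp(-z))
    h = z * s                                # Swish activation
    pred = h @ W2 + b2                       # linear output layer
    losses.append(np.mean((pred - y) ** 2))  # MSE training loss
    g = 2.0 * (pred - y) / len(x)            # dMSE/dpred
    dW2 = h.T @ g; db2 = g.sum(0)
    dh = g @ W2.T
    dz = dh * (s + z * s * (1.0 - s))        # Swish derivative
    dW1 = x.T @ dz; db1 = dz.sum(0)
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2
```

In the VI, the same loop additionally runs the test dataset through the network when "Eval?" is pressed and plots both loss histories.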

## Training the Network

After running the VI, we can see both the training and test losses gradually decrease on the Loss Chart. Pressing the "**Eval?**" button reveals prediction curves that progressively come to resemble the sine and cosine curves (the ground truths) as the training and test losses decrease. Once the loss stops decreasing and the predictions closely align with the ground truth values, we can stop the training process.

Below is the video showcasing the training process.

## Inference: 2_Sin(x)_Cos(x)(Inference).vi

The front panel of the inference VI provides the possibility to choose a specific value for the input and observe the predictions against the ground truths, along with the absolute error percentages between the ground truth and predicted values.
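As a text stand-in for the front panel, here is a sketch of that error readout; the input value and the predicted outputs below are hypothetical illustrations, not taken from an actual trained network:

```python
import numpy as np

def absolute_error_percent(pred, truth, eps=1e-12):
    """Absolute error as a percentage of the ground-truth magnitude
    (eps guards against division by zero near the functions' roots)."""
    return 100.0 * np.abs(pred - truth) / np.maximum(np.abs(truth), eps)

x = 1.25                                  # hypothetical front-panel input
truth = np.array([np.sin(x), np.cos(x)])  # ground truths
pred = np.array([0.9468, 0.3172])         # hypothetical network outputs
print(absolute_error_percent(pred, truth))
```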

## Summary

In this tutorial, we demonstrated how to develop deep neural network models for multi-output non-linear regression. This simple example can serve as a starting point for implementing deep-learning-based models for other non-linear regression problems using __DeepLTK__ (Deep Learning Toolkit for LabVIEW).