mrkurtan

Reputation: 591

Signal classification - recognise a signal with AI

I have a problem with recognising a signal. Let's say the signal is quasiperiodic and its period length stays within finite limits. The "shape" of the signal must match certain criteria, so the current algorithm uses signal processing techniques such as filtering, differentiating the signal, and looking for maximum and minimum values. It has a good rate of finding the good signals, but the problem is that it also detects wrong shapes.

So I want to use Artificial Intelligence - mainly neural networks - to overcome this problem. I thought a multi-layer network with some averaged inputs (the signal can be reduced) and one output that would show the "matching" from 0..1 could work. However, I have never done such a thing before, so I am asking for help: how do I achieve something like this? How do I train the neural network to get the expected results? (Let's say I have input vectors which should give 1 as output.)

Or is this whole idea the wrong approach to the problem? I am open to any learning algorithm or idea I could study and use to overcome this problem.

Here is a figure of the measured signal(s) (the values and time scale are not a concern now); you can see a lot of "wrong" ones, but most of the detected signals are good, as mentioned above. [figure: examples of measured signals]
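
For concreteness, this is roughly what I imagine (an untested sketch in Keras; the 64-point input length and the X_train / y_train arrays are placeholders for my reduced signals and their 0/1 labels):

# Untested sketch: a small multi-layer network with a fixed-length input
# (the signal reduced/averaged to 64 points) and one sigmoid output in 0..1.
from keras.models import Sequential
from keras.layers import Dense

model = Sequential()
model.add(Dense(32, activation='relu', input_shape=(64,)))  # hidden layer
model.add(Dense(1, activation='sigmoid'))                   # "matching" score in 0..1

model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
model.fit(X_train, y_train, epochs=50, batch_size=32)       # supervised training on labelled examples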

Upvotes: 1

Views: 605

Answers (3)

Kumar Roshan Mehta

Reputation: 3316

You can try a 1D convolution. The basic idea is to give each window of the signal a label, 0: bad or 1: good. After that, you can build a model like this:

from keras.models import Sequential
from keras.layers import Conv1D, Bidirectional, LSTM, Dropout, Flatten, Dense

timesteps = 100  # length of your fixed-size signal windows; adjust to your data

model = Sequential()
model.add(Conv1D(filters=64, kernel_size=3, activation='relu', padding='same',
                 input_shape=(timesteps, 1)))  # one feature per timestamp
model.add(Bidirectional(LSTM(20, return_sequences=True)))
model.add(Conv1D(filters=64, kernel_size=3, activation='relu', padding='same'))
model.add(Dropout(0.5))
model.add(Flatten())
model.add(Dense(100, activation='sigmoid'))
model.add(Dense(2, activation='softmax'))

# two-class softmax output, so use categorical (not binary) cross-entropy
model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['accuracy'])

Train the model, then slice a new signal into windows and let the model predict each one. It will predict the given series as 0 and 1 values; if the count of 0s is greater than the count of 1s, the signal is not good.
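
For example, a rough sketch of that prediction and counting step (assuming new_signal is a 1-D NumPy array and timesteps matches the model's input length above):

import numpy as np

# Slice the new signal into consecutive, non-overlapping windows of length
# `timesteps`, classify each window, and count the predicted labels.
windows = np.stack([new_signal[i:i + timesteps]
                    for i in range(0, len(new_signal) - timesteps + 1, timesteps)])
windows = windows[..., np.newaxis]               # shape: (n_windows, timesteps, 1)

labels = model.predict(windows).argmax(axis=1)   # 0 = bad, 1 = good per window
is_good = (labels == 1).sum() > (labels == 0).sum()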

Upvotes: 1

Hearty

Reputation: 543

Your question can only be answered in a broad manner. You should consider editing it to prevent it from being closed.

But anyway, MATLAB has a lot of built-in functions and toolboxes to support Artificial Intelligence, with plenty of sample code available that you can modify and refer to. You can find some on the MATLAB File Exchange.

And I know that reading a lot of technical papers on Artificial Intelligence is a daunting task, so good luck!

Upvotes: 1

Mihai8

Reputation: 3147

You can try to build a neural network using Neuroph. You can take inspiration from http://neuroph.sourceforge.net/TimeSeriesPredictionTutorial.html. On the other hand, it is also possible to approximate the signal using the Fourier transform.
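
For instance, a minimal sketch of the Fourier idea in Python (the choice of k = 16 coefficients is an arbitrary placeholder): describe each signal by the magnitudes of its first few DFT coefficients and feed that fixed-length vector to whatever classifier you use.

import numpy as np

# Rough sketch: approximate a signal by the magnitudes of its first k DFT
# coefficients, giving a fixed-length feature vector for a classifier.
def fourier_features(signal, k=16):
    spectrum = np.fft.rfft(signal)   # one-sided discrete Fourier transform
    return np.abs(spectrum[:k])      # first k magnitudes as features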

Upvotes: 1
