Reputation: 12433
I have a model whose input is a 129-dimensional complex-valued vector (short-time Fourier transformed data).
An example input looks like this:
[-1.3352364e+01+0.0000000e+00j 7.4373883e-01-7.2833991e-16j
2.2738211e+01+6.8436519e-16j -1.4453428e+01+3.1225023e-16j
1.0134536e+00+7.6327833e-17j -5.8692555e+00+7.8409501e-16j
-4.2160640e+00+3.7905386e-16j 9.3214293e+00-7.9450335e-16j
-3.8441191e+00-2.0816682e-16j 1.5526062e+00+9.3675068e-17j
1.8541154e-01+7.5892474e-16j -2.7615318e+00+8.3960616e-16j
6.1850090e+00+5.5511151e-17j -7.3036003e+00-8.9511731e-16j
5.1545906e+00-7.1212077e-16j -2.2619576e+00+0.0000000e+00j
-4.3920875e+00+3.8857806e-16j 9.7775030e+00-1.1102230e-15j
-4.4443369e+00-1.2054651e-16j -1.7421865e+00+3.4694470e-17j
3.4608727e+00+8.0491169e-16j -3.6370997e+00+1.1865509e-15j
6.7330283e-01-7.8668031e-16j 1.2871089e+00-5.2388649e-16j
1.4200196e-01+6.2753622e-16j -2.4753497e+00-1.0234869e-15j
2.4278961e-02-1.1839655e-15j 7.3392744e+00+4.3715032e-16j
-8.1446323e+00+7.2164497e-16j 2.3820071e+00-4.7878368e-16j
5.5490100e-01+9.6275480e-16j 9.6059316e-01-5.1347815e-16j
-1.4272486e+00+4.4408921e-16j -2.2834092e-01-6.9388939e-17j
1.3941400e-01+7.0778396e-16j 7.8855026e-01-1.2420620e-15j
-9.3603629e-01+5.5511151e-17j -8.9871936e-02-3.2612801e-16j
1.1856022e+00+7.0518187e-16j -1.0490714e+00+3.5735304e-16j
4.8156497e-01+1.6393137e-16j 1.9953914e-02-1.5612511e-16j
-1.4316249e-02-4.0680944e-16j -1.9098872e-01-2.4286129e-16j
-6.5025851e-02-2.2204460e-16j -2.7533963e-02-2.8449465e-16j
1.3631889e-01+6.2014686e-16j -1.9419394e-01-4.4408921e-16j
6.0891777e-01+3.3306691e-16j -5.1358789e-01+8.8817842e-16j
4.1886669e-01+8.2314307e-16j -1.1331944e+00-2.7061686e-16j
1.6293223e+00-5.5511151e-17j -1.3963546e+00-3.4000580e-16j
8.9522165e-01-7.8913150e-17j -3.3560959e-01-3.7123082e-16j
-1.4841197e-01-3.3306691e-16j 4.5283544e-01+4.6143644e-16j
-5.2438241e-01+3.4259110e-16j 2.5227445e-01+2.9837244e-16j
4.0655173e-02+3.6776138e-16j -2.1195586e-01+7.9797280e-16j
9.3151316e-02-1.4892768e-15j 4.1130298e-01-4.7853973e-16j
-3.6802697e-01-8.8817842e-16j 2.7421236e-01+2.0098398e-16j
-4.9323350e-01-2.8707995e-16j 7.0892453e-02+9.7838404e-16j
1.4285556e-02+1.8735014e-16j 1.2178756e-01+1.4571677e-16j
5.1822972e-01+1.0149809e-16j -6.0321730e-01-4.6143644e-16j
1.4312959e-01-5.5511151e-16j -2.0424712e-01-6.2796990e-16j
2.0290254e-01-1.1145766e-15j 7.0337042e-02+1.1796120e-16j
-2.2669752e-01+9.4368957e-16j 4.7235081e-01-1.7347235e-16j
-6.8114263e-01-3.7905386e-16j 7.0097405e-01+0.0000000e+00j
-5.3555572e-01+1.1102230e-16j 2.6501888e-01-4.4408921e-16j
-3.2118353e-01-6.2014686e-16j 3.7940162e-01-4.9266147e-16j
-4.3286872e-01-2.2204460e-16j 7.6514846e-01-2.0122792e-16j
-7.9566664e-01-3.1483553e-16j 1.8461785e-01+4.5102810e-17j
4.6878424e-01-1.6479873e-17j -7.7730691e-01+5.3082538e-16j
7.9691464e-01-3.5823718e-16j -5.1372331e-01-1.0061396e-15j
4.5839280e-01-5.5511151e-17j -6.0186821e-01-7.5633944e-16j
5.8818871e-01+1.0685729e-15j -4.3991232e-01+2.9143354e-16j
1.5778032e-01-4.4408921e-16j -6.3726664e-02-1.5265567e-16j
2.4285218e-01+1.5074914e-15j -2.8261366e-01-7.6327833e-17j
9.2593305e-02-5.5511151e-17j -7.3957220e-02+4.5102810e-16j
2.0222366e-01-6.5035362e-17j -2.2292452e-01-3.1225023e-17j
1.7134936e-01+3.0357661e-18j -8.9343295e-02+2.7408631e-16j
6.6628762e-02+1.2054651e-16j -8.4265225e-02-1.8735014e-16j
7.4724592e-02+1.3877788e-16j -4.5830503e-02+2.0816682e-17j
4.0348507e-02+2.3156880e-16j -4.6607938e-02-2.2204460e-16j
3.9488845e-02+1.6653345e-16j -5.3395957e-02-4.4408921e-16j
3.3790331e-02+4.5986944e-17j -1.1470942e-02+7.8409501e-16j
3.6072452e-03-5.5511151e-17j -1.0854214e-02+4.8572257e-17j
5.6150518e-02+6.8436519e-16j -5.3869747e-02+1.2836954e-16j
-4.3637045e-03+1.3877788e-17j 2.3376349e-02-7.5980888e-16j
2.7135586e-02-4.5986944e-17j -3.3272862e-02-1.7347235e-16j
-1.2956693e-02-3.2612801e-16j 2.3436353e-02+1.3183898e-16j
1.5689885e-02-7.3742516e-17j -5.3210557e-02-4.8816203e-17j
5.6559194e-02+0.0000000e+00j]
and the output of my model is this:
[[ 2.44907394e-01 -2.97553688e-01 2.11519375e-01 -1.90888457e-02
-4.56364267e-02 -6.27458245e-02 -1.32896289e-01 2.92474300e-01
-4.04089779e-01 -1.56403586e-01 -1.92916021e-01 -1.43633649e-01
-1.57259151e-01 5.65262511e-03 -2.09377334e-01 4.94567640e-02
-1.03674516e-01 -1.69391558e-03 -7.67782032e-02 6.16271086e-02
-7.57082552e-02 -5.81801347e-02 5.03328927e-02 -3.21788304e-02
1.44796409e-02 -1.82129852e-02 2.29691751e-02 4.87755574e-02
-3.32594924e-02 -4.09342609e-02 3.63402329e-02 1.22958608e-02
-1.94040649e-02 -8.86565819e-03 2.06985734e-02 1.35932527e-02
-3.36496159e-03 3.11814509e-02 3.27086858e-02 8.05965438e-03
1.59415863e-02 1.15749724e-02 8.10898468e-03 -2.60975584e-03
5.77399507e-03 1.21091865e-02 7.61231408e-03 1.23816207e-02
1.06919296e-02 1.21192187e-02 5.17597422e-03 8.74948129e-03
5.39486483e-03 8.50370154e-03 3.17635015e-03 1.04431063e-03
3.65899876e-03 2.61678174e-03 6.68763369e-03 1.77711621e-03
7.05862418e-03 4.92045656e-03 -1.12678483e-03 5.10105863e-03
7.67963007e-03 4.02958319e-03 1.09087341e-02 4.09850851e-03
-7.14905933e-03 -6.37976453e-03 1.45311467e-02 -1.75617263e-03
-2.48615816e-03 8.45167413e-03 1.35500357e-03 3.68746743e-03
7.73085281e-03 5.56082651e-03 3.27861309e-03 1.69695169e-03
1.68296695e-03 -7.13682547e-03 4.51812893e-03 1.05617158e-02
9.09534469e-03 7.56881759e-03 7.15654343e-04 -3.81373987e-03
-9.41876695e-03 1.34883039e-02 6.52562454e-03 5.85681945e-03
-3.25944275e-04 -3.52438539e-04 5.87854534e-03 4.60745022e-03
1.70308724e-03 4.45364043e-03 3.00474837e-03 5.36788255e-03
4.28943709e-03 1.88645348e-03 1.65197998e-04 3.76204029e-03
4.65429574e-03 2.02246010e-03 3.14211100e-03 3.25421616e-03
3.42429429e-03 4.88381833e-03 4.63513285e-03 1.57951191e-03
3.13404948e-03 2.97084078e-03 4.92273644e-03 1.47051737e-03
2.75985897e-03 3.42904776e-03 3.48226726e-03 4.90953028e-03
3.53986397e-03 2.55738944e-03 2.57845968e-03 3.87272611e-03
3.58704850e-03 2.76022032e-03 3.19864228e-03 3.40151414e-03
3.43684852e-03]]
The shape of the output NumPy array is the same, (1, 129).
However, the output has no imaginary part.
Why does this happen?
My model, which simply uses an LSTM, is shown below.
Do I need to do something special to handle the complex numbers?
from keras.models import Sequential
from keras.layers import LSTM, Dense
from keras.optimizers import Adam

NUM_DIM = 32
NUM_RNN = 100
model = Sequential()
model.add(LSTM(NUM_DIM, activation=None, input_shape=(NUM_RNN, 1), return_sequences=True))
model.add(Dense(1, activation="linear"))
model.compile(loss='mean_squared_error', optimizer=Adam(lr=0.01, beta_1=0.9, beta_2=0.999))
model.summary()
Upvotes: 0
Views: 1032
Reputation: 346
Machine learning frameworks are not able to handle complex-valued data (yet). There are still some major open challenges regarding gradients and activation functions in the complex plane (see, for example, this article for an overview).
In most approaches that I'm aware of, the real and imaginary parts of the data are simply concatenated, so the network only ever sees real data. For example, instead of [1 + 2i, 3 + 4i], the network would get [1, 3, 2, 4].
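In practice this is a one-line preprocessing step. A minimal sketch of what I mean (my own illustration, not code from the question; the frame length of 256 and the real-then-imaginary ordering are assumptions):

import numpy as np

# hypothetical 129-bin complex STFT frame, like the one shown in the question
stft_frame = np.fft.rfft(np.random.randn(256))   # shape (129,), complex

# concatenate real and imaginary parts into one real-valued vector, shape (258,)
real_features = np.concatenate([stft_frame.real, stft_frame.imag])

# reshape to (batch, timesteps, features) for an LSTM, e.g. (1, 258, 1)
real_features = real_features.astype(np.float32).reshape(1, -1, 1)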
(Your input data looks pretty much real-valued anyway, with imaginary parts many orders of magnitude smaller than the real parts. If this is the case everywhere, you could probably get away with just taking the real part...)
EDIT: I just checked: Keras networks only use the real part of the data anyway, which is why you still get a (real-valued) output.
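To see what that implicit cast effectively does, here is a NumPy-level sketch (my own illustration of the behaviour, not the actual code path inside Keras):

import numpy as np

x = np.array([1 + 2j, 3 + 4j], dtype=np.complex64)

# casting complex values to a float dtype keeps only the real part;
# NumPy emits a ComplexWarning that the imaginary part is being discarded
x_real = x.astype(np.float32)
print(x_real)  # [1. 3.] -> the imaginary part is silently dropped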
Upvotes: 1