lukygee

Reputation: 51

Machine learning with vectors in both features and target

How can I train a model with vectors/arrays as features? I consistently get errors when doing this...

My feature matrix would look something like this:

     A    B    C    Profile
0    1    4    4    [1,2,3,4]
1    2    4    5    [2,2,4,1]

while my target vector would look something like this:

0    [0,4,5,0]
1    [1,5,6,0]

and so on, but fit(x, y) fails when using LinearRegression from sklearn. Here is the output of print(x) and print(y):

x:

Beams/Beam[0]/Parameters/Energy     Beams/Beam[0]/Parameters/BunchPopulation    Beams/Beam[0]/BunchShape/Parameters/LongitudinalSigmaLabFrame   Simulation/NumberOfParticles    initialXHist
0   25.0    1.300000e+11    1.05    5000    [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, ...
1   25.0    1.300000e+11    1.05    5000    [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, ...
2   25.0    1.300000e+11    1.05    5000    [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, ...
3   25.0    1.300000e+11    1.05    5000    [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, ...
4   25.0    1.300000e+11    1.05    5000    [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, ...
...     ...     ...     ...     ...     ...
995     26.0    1.300000e+11    1.05    5000    [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, ...
996     26.0    1.300000e+11    1.05    5000    [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, ...
997     26.0    1.300000e+11    1.05    5000    [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, ...
998     26.0    1.300000e+11    1.05    5000    [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, ...
999     26.0    1.300000e+11    1.05    5000    [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, ...

1000 rows × 5 columns

y:

0      [8, 4, 6, 13, 5, 5, 10, 11, 15, 9, 19, 18, 16,...
1      [6, 5, 8, 8, 9, 12, 6, 20, 9, 20, 18, 12, 24, ...
2      [6, 6, 7, 8, 13, 10, 12, 7, 14, 14, 18, 24, 16...
3      [2, 5, 10, 3, 6, 8, 13, 12, 7, 18, 12, 20, 22,...
4      [5, 3, 5, 9, 8, 8, 8, 9, 14, 13, 10, 15, 21, 1...
                             ...                        
995    [2, 9, 4, 5, 10, 5, 10, 15, 16, 13, 12, 13, 21...
996    [2, 3, 5, 5, 11, 15, 18, 15, 14, 13, 16, 17, 1...
997    [4, 5, 6, 8, 5, 7, 7, 26, 13, 16, 17, 16, 17, ...
998    [1, 3, 5, 7, 5, 6, 16, 10, 17, 12, 12, 18, 24,...
999    [3, 4, 8, 9, 8, 4, 14, 17, 11, 16, 7, 20, 14, ...
Name: finalXHist, Length: 1000, dtype: object

Can anyone advise? The error I get is:

---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
TypeError: only size-1 arrays can be converted to Python scalars

The above exception was the direct cause of the following exception:

ValueError                                Traceback (most recent call last)
/tmp/ipykernel_826/1502489859.py in <module>
      3 
      4 # Train the model using the training sets
----> 5 regr.fit(X_train, y_train)
      6 
      7 # Make predictions using the testing set

/cvmfs/sft.cern.ch/lcg/views/LCG_101swan/x86_64-centos7-gcc8-opt/lib/python3.9/site-packages/sklearn/linear_model/_base.py in fit(self, X, y, sample_weight)
    516         accept_sparse = False if self.positive else ['csr', 'csc', 'coo']
    517 
--> 518         X, y = self._validate_data(X, y, accept_sparse=accept_sparse,
    519                                    y_numeric=True, multi_output=True)
    520 

/cvmfs/sft.cern.ch/lcg/views/LCG_101swan/x86_64-centos7-gcc8-opt/lib/python3.9/site-packages/sklearn/base.py in _validate_data(self, X, y, reset, validate_separately, **check_params)
    431                 y = check_array(y, **check_y_params)
    432             else:
--> 433                 X, y = check_X_y(X, y, **check_params)
    434             out = X, y
    435 

/cvmfs/sft.cern.ch/lcg/views/LCG_101swan/x86_64-centos7-gcc8-opt/lib/python3.9/site-packages/sklearn/utils/validation.py in inner_f(*args, **kwargs)
     61             extra_args = len(args) - len(all_args)
     62             if extra_args <= 0:
---> 63                 return f(*args, **kwargs)
     64 
     65             # extra_args > 0

/cvmfs/sft.cern.ch/lcg/views/LCG_101swan/x86_64-centos7-gcc8-opt/lib/python3.9/site-packages/sklearn/utils/validation.py in check_X_y(X, y, accept_sparse, accept_large_sparse, dtype, order, copy, force_all_finite, ensure_2d, allow_nd, multi_output, ensure_min_samples, ensure_min_features, y_numeric, estimator)
    869         raise ValueError("y cannot be None")
    870 
--> 871     X = check_array(X, accept_sparse=accept_sparse,
    872                     accept_large_sparse=accept_large_sparse,
    873                     dtype=dtype, order=order, copy=copy,

/cvmfs/sft.cern.ch/lcg/views/LCG_101swan/x86_64-centos7-gcc8-opt/lib/python3.9/site-packages/sklearn/utils/validation.py in inner_f(*args, **kwargs)
     61             extra_args = len(args) - len(all_args)
     62             if extra_args <= 0:
---> 63                 return f(*args, **kwargs)
     64 
     65             # extra_args > 0

/cvmfs/sft.cern.ch/lcg/views/LCG_101swan/x86_64-centos7-gcc8-opt/lib/python3.9/site-packages/sklearn/utils/validation.py in check_array(array, accept_sparse, accept_large_sparse, dtype, order, copy, force_all_finite, ensure_2d, allow_nd, ensure_min_samples, ensure_min_features, estimator)
    671                     array = array.astype(dtype, casting="unsafe", copy=False)
    672                 else:
--> 673                     array = np.asarray(array, order=order, dtype=dtype)
    674             except ComplexWarning as complex_warning:
    675                 raise ValueError("Complex data not supported\n"

ValueError: setting an array element with a sequence.

I've tried googling it with no luck so far; I suspect something is wrong with the way these two objects are set up.

Upvotes: 0

Views: 275

Answers (1)

Ben Reiniger

Reputation: 12602

The error is being raised for X (third-to-last part of the traceback): you cannot have an array-valued feature. You need to do some feature engineering to generate a flat table of data to train on; whether that's flattening the arrays into individual features, or extracting some statistic based on those arrays, or something else depends on what those arrays mean (and would be a better question for datascience.SE or stats.SE).
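For the flattening option, a minimal sketch (column names and the toy values are illustrative, based on the example frame in the question) might look like:

```python
import pandas as pd

# Toy frame mimicking the question's layout
df = pd.DataFrame({
    "A": [1, 2],
    "B": [4, 4],
    "C": [4, 5],
    "Profile": [[1, 2, 3, 4], [2, 2, 4, 1]],
})

# Expand the array-valued column into one scalar column per element,
# so every cell in the resulting frame is a single number
profile = pd.DataFrame(
    df["Profile"].tolist(),
    columns=[f"Profile_{i}" for i in range(4)],
    index=df.index,
)
X = pd.concat([df.drop(columns="Profile"), profile], axis=1)
print(X.shape)  # (2, 7): 3 scalar columns + 4 expanded ones
```

This flat frame can then be passed to fit() directly, since check_array can convert it to a numeric 2-D array.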

Having arrays for y will hit a similar issue in that object-dtype form, but if treating each element as a separate output is what you're after, the task becomes either a "multioutput" regression or a "multilabel" classification, which are supported by a subset of sklearn estimators (LinearRegression among them).
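As a sketch of the multioutput route: stacking the Series of lists into a proper 2-D float array lets LinearRegression fit one output per array element (the feature column here is a made-up placeholder):

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LinearRegression

# y as a Series of lists (dtype=object), as in the question
y = pd.Series([[0, 4, 5, 0], [1, 5, 6, 0], [2, 3, 4, 1]])

# Stack into a (n_samples, n_outputs) array so each element
# becomes its own regression target
Y = np.stack(y.to_numpy())           # shape (3, 4)

X = np.array([[1.0], [2.0], [3.0]])  # toy scalar feature

# LinearRegression handles 2-D y natively (multioutput regression)
regr = LinearRegression().fit(X, Y)
print(regr.predict(X).shape)         # (3, 4)
```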

Upvotes: 1
