Reputation: 29
This is the example Keras code that I want to convert to PyTorch. My input dataset is 10000*1*102 (two dimensions for the labels). The dataset includes 10000 samples; each sample is one row with 102 features. I am thinking of using a 1D CNN for regression.
PS: the hyper-parameters (e.g. filters, kernel_size, stride, padding) can be adjusted to suit my 10000*1*102 dataset.
model = Sequential()
model.add(Conv1D(filters=64, kernel_size=3, activation='relu', input_shape=(n_timesteps,n_features)))
model.add(Conv1D(filters=64, kernel_size=3, activation='relu'))
model.add(Dropout(0.5))
model.add(MaxPooling1D(pool_size=2))
model.add(Flatten())
model.add(Dense(100, activation='relu'))
model.add(Dense(n_outputs, activation='softmax'))
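For reference, this is roughly how I feed the data into the Keras model (just a minimal sketch; the loss, optimizer and the shapes of X and y here are assumptions to show how the samples are passed in):

# Rough training sketch (shapes and loss are assumptions, not my actual code).
# X: (10000, n_timesteps, n_features), y: (10000, n_outputs)
model.compile(loss='mse', optimizer='adam')
model.fit(X, y, epochs=10, batch_size=32, verbose=1)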
Upvotes: 0
Views: 1688
Reputation: 1
Name: torch
Version: 1.11.0.dev20211231+cu113
The existing answer has been very helpful, but the API has changed a little since it was written, so I'm leaving an updated version.
import torch.nn as nn
from torchsummary import summary as summary_

n_timesteps = 10000
n_features = 102
n_outputs = 1

a0 = nn.Conv1d(n_features, 64, 3)   # (batch, 102, 10000) -> (batch, 64, 9998)
a1 = nn.ReLU()
b0 = nn.Conv1d(64, 64, 3)           # length 9998 -> 9996
b1 = nn.ReLU()
c0 = nn.Dropout(p=0.5)
d0 = nn.MaxPool1d(2)                # length 9996 -> 4998
e0 = nn.Flatten()
e1 = nn.Linear(319872, 100)         # 64 channels * 4998 = 319872 flattened features
e2 = nn.ReLU()
e3 = nn.Linear(100, n_outputs)
f0 = nn.Softmax(dim=1)

model = nn.Sequential(a0, a1, b0, b1, c0, d0, e0, e1, e2, e3, f0)
model.to('cuda')

# torchsummary takes the per-sample input size as (channels, length)
summary_(model, (n_features, n_timesteps), batch_size=1)
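As a quick sanity check (a minimal sketch, assuming a CUDA device is available as in the summary call above), you can push a dummy batch through the model and confirm the output shape:

import torch
x = torch.randn(1, n_features, n_timesteps, device='cuda')  # (batch, channels, length)
out = model(x)
print(out.shape)  # torch.Size([1, 1]) -> one output per sample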
Upvotes: 0
Reputation: 197
Welcome to PyTorch. :) I am really glad you decided to switch from Keras to PyTorch. It was an important step for me in understanding how NNs work in more detail. If you have any specific questions about the code, or if it isn't working, please let me know.
import torch.nn as nn

# n_timesteps, n_features and n_outputs as defined in the question.
# Note the PyTorch names: Conv1d, ReLU; nn.Linear needs both in_features and out_features.
a0 = nn.Conv1d(n_timesteps, 64, 3)
a1 = nn.ReLU()
b0 = nn.Conv1d(64, 64, 3)
b1 = nn.ReLU()
c0 = nn.Dropout(p=0.5)
d0 = nn.MaxPool1d(2)
e0 = nn.Flatten()
e1 = nn.Linear(32*n_timesteps, 100)  # in_features must match the flattened conv output
e2 = nn.ReLU()
e3 = nn.Linear(100, n_outputs)
f0 = nn.Softmax(dim=1)
model = nn.Sequential(a0,a1,b0,b1,c0,d0,e0,e1,e2,e3,f0)
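If the hard-coded in_features of e1 does not match your data, one way to get the right value (a minimal sketch; seq_len and the dummy input's layout are assumptions and must match your real samples) is to run a dummy tensor through the convolutional part:

import torch
conv_part = nn.Sequential(a0, a1, b0, b1, c0, d0, e0)
with torch.no_grad():
    # Conv1d expects (batch, in_channels, length); seq_len is the length of one
    # sample along the convolved axis (assumed here, not defined in the question).
    flat_dim = conv_part(torch.zeros(1, n_timesteps, seq_len)).shape[1]
e1 = nn.Linear(flat_dim, 100)  # replaces the hard-coded 32*n_timesteps guess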
Upvotes: 2