The Pineapple

Reputation: 567

Solving set of ODEs with Scipy

I am trying to develop an algorithm (using scipy.integrate.odeint()) that predicts the changing concentrations of cells, substrate, and product (i.e., 𝑋, 𝑆, 𝑃) over time until the system reaches steady state (~100 or 200 hours). The initial concentration of cells in the bioreactor is 0.1 𝑔/𝐿, and there is no glucose or product in the reactor initially. I want to test the algorithm for a range of flow rates, 𝑄, between 0.01 𝐿/ℎ and 0.25 𝐿/ℎ, and analyze the impact of the flow rate on the product production rate (i.e., 𝑄 ⋅ 𝑃 in 𝑔/ℎ). Eventually, I would like to generate a plot of production rate (y-axis) versus flow rate, 𝑄 (x-axis), and estimate the flow rate that results in the maximum (or critical) production rate. This is my code so far:

from scipy.integrate import odeint
import numpy as np

# Constants
u_max = 0.65
K_s = 0.14
K_1 = 0.48
V = 2
X_in = 0
S_in = 4
Y_s = 0.38
Y_p = 0.2

# Variables
# Q - Flow Rate (L/h), value between 0.01 and 0.25 that produces best Q * P
# X - Cell Concentration (g/L)
# S - The glucose concentration (g/L)
# P - Product Concentration (g/L)

# Equations
def func_dX_dt(X, t, S):
    u = (u_max) / (1 + (K_s / S))
    dX_dt = (((Q * S_in) - (Q * S)) / V) + (u * X)
    return dX_dt

def func_dS_dt(S, t, X):
    u = (u_max) / (1 + (K_s / S))
    dS_dt = (((Q * S_in) - (Q * S)) / V) - (u * (X / Y_s))
    return dS_dt

def func_dP_dt(P, t, X, S):
    u = (u_max) / (1 + (K_s / S))
    dP_dt = ((-Q * P) / V) - (u * (X / Y_p))
    return dP_dt

t = np.linspace(0, 200, 200)

# Q placeholder
Q = 0.01

# Attempt to solve the Ordinary differential equations
sol_dX_dt = odeint(func_dX_dt, 0.1, t, args=(S,))
sol_dS_dt = odeint(func_dS_dt, 0.1, t, args=(X,))
sol_dP_dt = odeint(func_dP_dt, 0.1, t, args=(X,S))

In the program's current state there does not seem to be a way to generate the steady-state value for P. I attempted the following modification to get the value of X:

sol_dX_dt = odeint(func_dX_dt, 0.1, t, args=(odeint(func_dS_dt, 0.1, t, args=(X,)),))

It produces the error:

NameError: name 'X' is not defined

At this point I am not sure how to move forward.

(Edit 1: Added Original Equations)

First equation:

    dX/dt = (Q·X_in − Q·X)/V + μ·X,   with   μ = μ_max/((1 + K_s/S)(1 + S/K_1))

Second and third equations:

    dS/dt = (Q·S_in − Q·S)/V − μ·X/Y_s
    dP/dt = −Q·P/V − μ·X/Y_p

Upvotes: 0

Views: 3066

Answers (1)

eyllanesc

Reputation: 243975

You do not have to solve each equation separately; instead, write a single function that returns a tuple of the three derivatives, as I show below:

import numpy as np
from scipy.integrate import odeint
import matplotlib.pyplot as plt

Q = 0.01
V = 2
Ys = 0.38
Sin = 4
Yp = 0.2
Xin = 0
umax = 0.65
Ks = 0.14
K1 = 0.48

def mu(S, umax, Ks, K1):
    # Specific growth rate: Monod kinetics with a substrate-inhibition term
    return umax/((1+Ks/S)*(1+S/K1))

def dxdt(x, t, *args):
    # Right-hand side of the coupled system; x holds the full state (X, S, P)
    X, S, P = x
    Q, V, Xin, Ys, Sin, Yp, umax, Ks, K1 = args
    m = mu(S, umax, Ks, K1)
    dXdt = (Q*Xin - Q*X)/V + m*X     # cell balance
    dSdt = (Q*Sin - Q*S)/V - m*X/Ys  # substrate balance
    dPdt = -Q*P/V - m*X/Yp           # product balance
    return dXdt, dSdt, dPdt

t = np.linspace(0, 200, 200)

# Initial conditions for X, S, P
X0 = 0.1
S0 = 0.1
P0 = 0.1
x0 = X0, S0, P0
sol = odeint(dxdt, x0, t, args=(Q, V, Xin, Ys, Sin, Yp, umax, Ks, K1))

plt.plot(t, sol[:, 0], 'r', label='X(t)')
plt.plot(t, sol[:, 1], 'g', label='S(t)')
plt.plot(t, sol[:, 2], 'b', label='P(t)')
plt.legend(loc='best')
plt.xlabel('t')
plt.grid()
plt.show()

Output:

[Plot of X(t), S(t), and P(t) versus t]
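To get the production-rate-versus-flow-rate plot described in the question, one minimal extension of the code above is to solve the system once for each flow rate and take the last time point as an approximation of the steady state. This is a sketch that reuses the names defined above and assumes 200 hours is long enough for the system to settle:

Qs = np.linspace(0.01, 0.25, 50)
production = []
for Q in Qs:
    sol = odeint(dxdt, x0, t, args=(Q, V, Xin, Ys, Sin, Yp, umax, Ks, K1))
    P_ss = sol[-1, 2]            # product concentration at the final time point
    production.append(Q * P_ss)  # production rate Q * P in g/h

production = np.array(production)
print('Flow rate with maximum production rate:', Qs[np.argmax(production)])

plt.plot(Qs, production)
plt.xlabel('Q (L/h)')
plt.ylabel('Q * P (g/h)')
plt.grid()
plt.show()

If the final value has not converged for some flow rates, a simple check is to increase the end time in t or compare the last two time points.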

Upvotes: 1
