SIRD (Discrete variant)
This example shows how to calibrate a SIRD model. This model is a variation of the SIR model that separates the Recovered compartment into two: Recovered with immunity and Deceased.
The system of equations that describe the changes of the compartments over time is:
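\[
\begin{aligned}
\frac{dS}{dt} &= -\frac{\beta}{n} S I \\
\frac{dI}{dt} &= \frac{\beta}{n} S I - (\gamma + \mu) I \\
\frac{dR}{dt} &= \gamma I \\
\frac{dD}{dt} &= \mu I
\end{aligned}
\]
where \(n\) is the effective population size, \(\beta\) the transmission rate, \(\gamma\) the recovery rate and \(\mu\) the mortality rate.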
Model implementation
The SIRD model is implemented with the help of the SciPy Python package, which provides methods to solve systems of ODEs.
The model receives 7 configurable parameters:
- n: the effective population size involved in the model.
- initial_i, initial_r, initial_d: the initial number of people in the infected, recovered, and deceased compartments, respectively.
- beta: \(\beta\) in the ODE system.
- gamma: \(\gamma\) in the ODE system.
- mu: \(\mu\) in the ODE system.
# ac, Int and Real come from OptiLog's tuning interface
# (e.g. from optilog.tuning import ac, Int, Real; the exact import path may vary by version).
@ac
def model(
    data,
    n: Int(70_000, 500_000) = 70_000,
    initial_i: Int(1, 1_000) = 10,
    initial_r: Int(0, 1_000) = 4,
    initial_d: Int(0, 1_000) = 0,
    beta: Real(0.1, 1.0) = 0.5,
    gamma: Real(0.1, 1.0) = 0.2,
    mu: Real(0.1, 1.0) = 0.2
) -> float:
    pass
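Outside of the tuning loop the decorated function can still be called directly, in which case the default values declared in the signature are used (a minimal sketch, assuming the dataset has already been loaded):

cost = model(data)  # uses the defaults: n=70_000, initial_i=10, beta=0.5, ...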
The ODE system is modeled using the function sird_ode. Note that \(\beta\) is normalized (divided by \(n\)) at the start of the function model.
def sird_ode(x, t, beta, gamma, mu):
    # beta is expected to be already divided by n (see the model function below)
    S, I, R, D = x
    dS_dt = -beta * S * I
    dI_dt = beta * S * I - (gamma + mu) * I
    dR_dt = gamma * I
    dD_dt = mu * I
    return dS_dt, dI_dt, dR_dt, dD_dt
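As a quick sanity check (not part of the original example), the four derivatives sum to zero, so the total population S + I + R + D is conserved by the dynamics:

# Hypothetical state and parameter values, chosen only for illustration.
x = (69_000.0, 900.0, 90.0, 10.0)
derivatives = sird_ode(x, 0.0, beta=0.5 / 70_000, gamma=0.2, mu=0.2)
assert abs(sum(derivatives)) < 1e-9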
Using SciPy, we solve the ODE system:

import scipy.integrate
import numpy

def model(...):
    # normalize beta by the effective population
    beta = beta / n

    n_days = 40
    microsteps = 100
    # time grid with microsteps points per simulated day
    time = numpy.linspace(
        0, n_days - 0.0001, num=n_days * microsteps)

    # initial state: susceptible, infected, recovered, deceased
    x0 = (
        n - initial_i - initial_r - initial_d,
        initial_i,
        initial_r,
        initial_d
    )

    ode_sol = scipy.integrate.odeint(
        sird_ode, x0, time, args=(beta, gamma, mu))
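For reference, scipy.integrate.odeint returns one row per timestep and one column per compartment, in the order produced by sird_ode, so the individual trajectories can be unpacked as follows (a small sketch, not part of the original code):

# ode_sol has shape (n_days * microsteps, 4); columns are S, I, R, D.
S, I, R, D = ode_sol.T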
Finally, we compute and return the MSE of the predicted infected vs the real infected. This assumes a dataset with a column named '#infected'. As ode_sol is a NumPy array with \(n\_days \cdot microsteps\) rows (one per timestep) and one column per compartment, in compute_mse we keep, for each day, only the first predicted number of infected, with the help of numpy.searchsorted.
def model(...) -> float:
    ...
    ode_sol = ...

    real_inf = data['#infected'].values[:n_days]
    return compute_mse(real_inf, ode_sol,
                       n_days, time)

def compute_mse(real_infected, ode_sol,
                n_days, timestamps):
    pred = numpy.zeros(n_days)
    for d in range(n_days):
        # index of the first timestep that belongs to day d
        first_for_day = numpy.searchsorted(timestamps, d)
        # column 1 of ode_sol is the infected (I) compartment
        pred[d] = ode_sol[first_for_day, 1]
    return ((real_infected - pred)**2).sum() / n_days
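To illustrate how numpy.searchsorted selects the first timestep of each day, consider the time grid defined above (the values shown are only an illustration of the indexing, not part of the example):

time = numpy.linspace(0, 40 - 0.0001, num=40 * 100)
numpy.searchsorted(time, 0)   # -> 0, first timestep of day 0
numpy.searchsorted(time, 1)   # -> 100, first timestep of day 1
numpy.searchsorted(time, 39)  # -> 3900, first timestep of day 39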
Finally, we modify our entrypoint to load the dataset as a Pandas DataFrame and pass it to the model:
import pandas

def entrypoint(data, seed):
    data = pandas.read_csv(data, index_col="date")
    cost = model(data)
    print(f"Result: {cost}")
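A hypothetical invocation (the CSV path is an assumption) only requires a file with a date column, used as the index, and a '#infected' column:

entrypoint("infected_by_day.csv", seed=0)  # prints "Result: <mse>"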
Preparing the model to be run using Docker
In order to execute the model in a container using Docker, we do the following:
- Dockerfile:
This project does not require additional software to be installed, so we keep the Dockerfile as it is.
- requirements.txt:
We add the required packages for model.py to the requirements file:
optilog==0.3.5
scipy==1.9.3
pandas==1.5.1
numpy==1.23.4
- Makefile (OPTIONAL):
We change the name of the image:
# ...
# Change the following as desired
IMAGE_NAME=sird_discrete
TAG=latest
image_full_name=${IMAGE_NAME}:${TAG}
# ...
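With these changes in place, the image can be built with the standard Docker workflow, for example docker build -t sird_discrete:latest . from the project root (or through the Makefile, if the template provides a build target).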
Example of output
Once the calibration process finishes, the software outputs the best configuration found. The best parameters are listed for each configurable function as function(parameter): value. The configuration is also reported as a JSON object, which can be used to inject the parameters using OptiLog.
[...]
Tuning process finished.
========================
Best parameters:
- model(n): 453806
- model(initial_i): 36
- model(initial_r): 311
- model(initial_d): 11
- model(beta): 0.4754266080892542
- model(gamma): 0.1425160712202053
- model(mu): 0.8994065043911625
========================
Best Config JSON:
{'_0_0__n': 453806, '_1_0__initial_i': 36, [...]}
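As a rough illustration only (OptiLog provides its own mechanism to inject a configuration), the tuned values could also be applied manually by stripping the internal prefixes from the JSON keys and passing the result as keyword arguments:

# Truncated configuration taken from the output above; the remaining keys
# follow the same pattern and are omitted here.
best_config = {'_0_0__n': 453806, '_1_0__initial_i': 36}
params = {key.split('__', 1)[1]: value for key, value in best_config.items()}
cost = model(data, **params)  # assumes data was loaded as in the entrypoint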