Data logs

Simulations log their results in a DataLog. This is a versatile logging class which makes few assumptions about the type of data being logged. When analysing the results of 1d or 2d rectangular simulations, it can be useful to convert a DataLog to a DataBlock1d or DataBlock2d.

class myokit.DataLog(other=None, time=None)

A dictionary time-series, for example data logged during a simulation or experiment.

A DataLog is expected but not required to contain a single entry indicating time and any number of entries representing a variable varying over time.

Single cell data is accessed simply by the variable name:

v = log['membrane.V']

Multi-cell data is accessed by appending the index of the cell before the variable name. For example:

v = log['1.2.membrane.V']

This returns the membrane potential for cell (1, 2). Another way to obtain the same result is:

v = log['membrane.V', 1, 2]

or, finally:

v = log['membrane.V', (1, 2)]
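All three access forms map to a single string key; a minimal sketch of the scheme (make_key is an illustrative helper, not part of the myokit API):

```python
def make_key(name, *index):
    # Build a multi-cell key by prepending each cell index, separated by dots.
    return ''.join(str(i) + '.' for i in index) + name

# log['membrane.V', 1, 2] addresses the same entry as log['1.2.membrane.V']
print(make_key('membrane.V', 1, 2))  # '1.2.membrane.V'
```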

Every array stored in the log must have the same length. This condition can be checked by calling the method validate().

A new DataLog can be created in a number of ways:

# Create an empty DataLog:
d = myokit.DataLog()
d['time'] = [1, 2, 3]
d['data'] = [2, 4, 5]

# Create a clone of d
e = myokit.DataLog(d)

# Create a DataLog based on a dictionary
d = myokit.DataLog({'time':[1, 2, 3], 'data':[2, 4, 5]}, time='time')


other
A DataLog to clone, or a dictionary to use as a basis.
time
The log key to use for the time variable. When cloning a log, adding the time argument will overwrite the cloned value.
apd(v='membrane.V', threshold=-70)

Calculates one or more Action Potential Durations (APDs) in a single cell’s membrane potential.

Note 1: More accurate apd measurements can be obtained using the Simulation object's APD tracking functionality. See the Simulation documentation for details.

Note 2: This APD is defined by simply checking crossing of a threshold potential, and does not look at lowest or highest voltages in a signal.

The membrane potential data should be listed in the log under the key given by v.

The APD is measured as the time during which the membrane potential exceeds a fixed threshold. It does not calculate dynamic thresholds like “90% of max(V) - min(V)”.

The returned value is a list of tuples (AP_start, APD).
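The fixed-threshold rule can be sketched in pure Python. This is an illustrative simplification that reports the sample times at which the threshold is crossed, not the myokit implementation itself:

```python
def apd(times, v, threshold=-70):
    """Return a list of (AP_start, APD) tuples, where an action potential
    is the interval during which v exceeds the fixed threshold."""
    apds = []
    start = None
    for t, x in zip(times, v):
        if start is None:
            if x > threshold:
                start = t                        # upward crossing: AP begins
        elif x <= threshold:
            apds.append((start, t - start))      # downward crossing: AP ends
            start = None
    return apds
```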


block1d()

Returns a copy of this log as a DataBlock1d.


block2d()

Returns a copy of this log as a DataBlock2d.


clone(numpy=False)

Returns a deep clone of this log.

All lists in the log will be duplicated, but the list contents are assumed to be numerical (and thereby immutable) and won’t be cloned.

A log with numpy arrays instead of lists can be created by setting numpy=True.


extend(other)

Returns a copy of this log, extended with the data of another.

Both logs must have the same keys and the same time key. The added data must be from later time points than in the log being extended.


find(value)

Deprecated alias of find_after().


find_after(time)

Returns the lowest index i such that

times[i] >= time

where times are the times stored in this DataLog.

If no such value exists in the log, len(times) is returned.
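Since the stored times are sorted, this lookup is a plain left bisection; a pure-Python sketch (the helper here is a stand-in, not the myokit method itself):

```python
from bisect import bisect_left

def find_after(times, time):
    # Lowest index i with times[i] >= time; len(times) when no such entry.
    return bisect_left(times, time)
```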

fold(period, discard_remainder=True)

Creates a copy of the log, split with the given period. Split signals are given indexes so that “current” becomes “0.current”, “1.current”, “2.current”, etc.

If the log's entries do not divide evenly by period, the remainder will be ignored. This commonly happens due to floating point rounding errors (in which case the remainder is a single entry). To disable this behavior, set discard_remainder=False.
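A pure-Python sketch of the splitting scheme, assuming a regularly sampled, zero-based time vector (the helper and its dict-based log are illustrative only):

```python
def fold(times, data, period):
    # Sketch: assumes regularly spaced samples starting at t=0.
    dt = times[1] - times[0]
    n = int(round(period / dt))      # samples per period
    splits = len(times) // n         # complete periods; remainder discarded
    out = {'time': times[:n]}
    for name, values in data.items():
        for k in range(splits):
            out['%d.%s' % (k, name)] = values[k * n:(k + 1) * n]
    return out
```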


has_nan()

Returns True if one of the variables in this DataLog has a NaN as its final logged value.

integrate(name, *cell)

Integrates a field from this log and returns it:

# Run a simulation and calculate the total charge carried by INa
s = myokit.Simulation(m, p)
d = s.run(1000)
q = d.integrate('ina.INa')


name
The name of the variable to return, for example 'ik1.IK1' or '2.1.membrane.V'.
cell
An optional cell index, for easy access to multi-cellular data, for example log.integrate('membrane.V', 2, 1).
interpolate_at(name, time)

Returns the value for variable name at a given time, determined using linear interpolation between the nearest matching times.
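Linear interpolation between the two nearest samples can be sketched as follows (illustrative helper, assuming sorted times and a query time inside the logged range):

```python
from bisect import bisect_left

def interpolate_at(times, values, t):
    # Find the first sample at or after t; interpolate with its predecessor.
    i = bisect_left(times, t)
    if times[i] == t:
        return values[i]
    t0, t1 = times[i - 1], times[i]
    v0, v1 = values[i - 1], values[i]
    return v0 + (v1 - v0) * (t - t0) / (t1 - t0)
```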


isplit(i)

Returns two logs, where the first contains all this log's entries up to index i, and the second contains all entries starting from i and higher.

itrim(a, b)

Returns a copy of this log, with all entries trimmed to the region between indices a and b (similar to performing x = x[a:b] on a list).


itrim_left(i)

Returns a copy of this log, with all entries before index i removed (similar to performing x = x[i:] on a list).


itrim_right(i)

Returns a copy of this log, with all entries starting from index i removed (similar to performing x = x[:i] on a list).


keys_like(query)

Returns all keys that match the pattern *.query, sorted alphabetically.

For example, log.keys_like('membrane.V') could return a list ['0.membrane.V', '1.membrane.V', '2.membrane.V', ...], or ['0.0.membrane.V', '0.1.membrane.V', '1.0.membrane.V', ...].


length()

Returns the length of the entries in this log. If the log is empty, zero is returned.

static load(filename, progress=None, msg='Loading DataLog')

Loads a DataLog from the binary format used by myokit.

The values in the log will be stored in an array.array. The data type used by the array will be the one specified in the binary file. Note that an array.array storing single precision floats will convert values to Python float objects when items are accessed.

To obtain feedback on the loading progress, an object implementing the myokit.ProgressReporter interface can be passed in as progress. An optional description to use in the ProgressReporter can be passed in as msg.

static load_csv(filename, precision=64)

Loads a CSV file from disk and returns it as a DataLog.

The CSV file must start with a header line indicating the variable names, separated by commas. Each subsequent row should contain the values at a single point in time for all logged variables.

The DataLog is created using the data type specified by the argument precision, regardless of the data type of the stored data.

The log attempts to set a time variable by searching for a strictly increasing variable. In the case of a tie the first strictly increasing variable is used. This means logs stored with save_csv() can safely be read.
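The time-variable detection described above amounts to scanning for the first strictly increasing column; an illustrative sketch:

```python
def guess_time_key(log):
    # Return the first key whose values are strictly increasing, or None.
    for key, values in log.items():
        if all(a < b for a, b in zip(values, values[1:])):
            return key
    return None
```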


npview()

Returns a DataLog with numpy array views of this log's data.

regularize(dt, tmin=None, tmax=None)

Returns a copy of this DataLog with data points at regularly spaced times.

Note: While regularize() can be used post-simulation to create fixed time-step data from variable time-step data, it is usually better to re-run the simulation with fixed time-step logging. See the Simulation documentation for details.

The first point will be at tmin if specified or otherwise the first time present in the log. All following points will be spaced dt time units apart and the final point will be less than or equal to tmax. If no value for tmax is given the final value in the log is used.

This method works by

  1. Finding the indices corresponding to tmin and tmax.
  2. Creating a spline interpolant with all the data from tmin to tmax. If possible, two points to the left and right of tmin and tmax will be included in the interpolated data set (so only if there are at least two values before tmin or two values after tmax in the data respectively).
  3. Evaluating the interpolant at the regularly spaced points.

As a result of the cubic spline interpolation, the function may perform poorly on large data sets.

This method requires SciPy to be installed.
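The placement of the regularly spaced points (first point at tmin, spacing dt, last point at or below tmax) can be sketched as:

```python
def regular_times(tmin, tmax, dt):
    # First point at tmin; subsequent points dt apart; last point <= tmax.
    n = int((tmax - tmin) // dt) + 1
    return [tmin + i * dt for i in range(n)]
```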

save(filename, precision=64)

Writes this DataLog to a binary file.

The resulting file will be a zip file with the following entries:

  • A csv file with the fields name, dtype, len for each variable.
  • The binary data, in the order specified by the header.
  • A text file explaining the file format.

The optional argument precision allows logs to be stored in single precision format, which saves space.

save_csv(filename, precision=64, order=None, delimiter=', ', header=True)

Writes this DataLog to a CSV file, following the syntax outlined in RFC 4180 and with a header indicating the field names.

The resulting file will consist of:

  • A header line containing the names of all logged variables, separated by commas. If present, the time variable will be the first entry on the line. The remaining keys are ordered using a natural sort order.
  • Each following line will be a comma separated list of values in the same order as the header line. A line is added for each time point logged.


filename
The file to write (existing files will be overwritten without warning).
precision
If a precision argument (for example myokit.DOUBLE_PRECISION) is given, the output will be stored in such a way that this amount of precision is guaranteed to be present in the string. If the precision argument is set to None, Python's default formatting is used, which may lead to smaller files.
order
To specify the ordering of the log's arguments, pass in a sequence order with the log's keys.
delimiter
This field can be used to set an alternative delimiter. To use spaces, set delimiter=' '; for tabs, delimiter='\t'. Note that some delimiters (for example '\n' or '1234') will produce an unreadable or invalid csv file.
header
Set this to False to avoid adding a header to the file. Note that Myokit will not be able to read the written csv file without this header.

A note about locale settings: On Windows systems with a locale setting that uses the comma as a decimal separator, importing CSV files into Microsoft Excel can be troublesome. To correctly import a CSV, either (1) Change your locale settings to use “.” as a decimal separator or (2) Use the import wizard under Data > Get External Data to manually specify the correct separator and delimiter.


set_time_key(key)

Sets the key under which the time data is stored.


split(value)

Splits the log into a part before and after the time value:

s = myokit.Simulation(m, p)
d = s.run(1000)
d1, d2 = d.split(100)

In this example, d1 will contain all values up to, but not including, t=100, while d2 will contain the values from t=100 upwards.
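A sketch of the partition rule on a single time/value series (the helper is illustrative; the real method splits every entry in the log):

```python
from bisect import bisect_left

def split(times, values, t):
    # First part: strictly before t. Second part: from t onwards.
    i = bisect_left(times, t)
    return (times[:i], values[:i]), (times[i:], values[i:])
```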

split_periodic(period, adjust=False, closed_intervals=True)

Splits this log into multiple logs, each covering an equal period of time. For example a log covering the time span [0, 10000] can be split with period 1000 to obtain ten logs covering [0, 1000], [1000, 2000] etc.

The split log files can be returned as-is, or with the time variable’s value adjusted so that all logs appear to cover the same span. To enable this option, set adjust to True.

By default, the returned intervals are closed, so both the left and right endpoint are included (if present in the data). This may involve the duplication of some data points. To disable this behaviour and return half-closed endpoints (containing only the left point), set closed_intervals to False.
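A pure-Python sketch of the periodic split for one series, showing both the adjust and closed_intervals options (illustrative, assuming the log starts at t=0):

```python
def split_periodic(times, values, period, adjust=False, closed_intervals=True):
    # Number of complete periods covered by the data (at least one).
    n = int(times[-1] // period) or 1
    logs = []
    for k in range(n):
        lo, hi = k * period, (k + 1) * period
        if closed_intervals:
            # Both endpoints kept: boundary points appear in two pieces.
            keep = [i for i, t in enumerate(times) if lo <= t <= hi]
        else:
            keep = [i for i, t in enumerate(times) if lo <= t < hi]
        shift = lo if adjust else 0
        logs.append(([times[i] - shift for i in keep],
                     [values[i] for i in keep]))
    return logs
```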


time()

Returns this log's time array.

Raises a myokit.InvalidDataLogError if the time variable for this log has not been specified or an invalid key was given for the time variable.


time_key()

Returns the name of the time variable stored in this log, or None if no time variable was set.

trim(a, b, adjust=False)

Returns a copy of this log, with all data before time a and after (and including) time b removed.

If adjust is set to True, all logged times will be lowered by a.
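The trimming rule (keep a <= t < b, optionally shifting times down by a) can be sketched for a single series:

```python
def trim(times, values, a, b, adjust=False):
    # Keep samples with a <= t < b; optionally shift times down by a.
    keep = [i for i, t in enumerate(times) if a <= t < b]
    shift = a if adjust else 0
    return [times[i] - shift for i in keep], [values[i] for i in keep]
```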

trim_left(value, adjust=False)

Returns a copy of this log, with all data before time value removed.

If adjust is set to True, all logged times will be lowered by value.


trim_right(value)

Returns a copy of this log, with all data at times after and including value removed.


validate()

Validates this DataLog. Raises a myokit.InvalidDataLogError if the log has inconsistencies.


variable_info()

Returns a dictionary mapping fully qualified variable names to LoggedVariableInfo instances, providing information about the logged data.

Comes with the following constraints:

  • Per variable, the data must have a consistent dimensionality. For example having a key 0.membrane.V and a key 1.1.membrane.V would violate this constraint.
  • Per variable, the data must be regular across dimensions. For example, if there are n entries of the form 0.x.membrane.V and also entries of the form 1.x.membrane.V, then the values of x must be the same in both cases.

An example of a dataset that violates the second constraint is:

0.0.membrane.V
0.1.membrane.V
1.0.membrane.V

Here the values of x for 0.x are {0, 1}, but for 1.x only {0}.
If either of the constraints is violated a ValueError is raised.

class myokit.LoggedVariableInfo

Contains information about the log entries for each variable. These objects should only be created by DataLog.variable_info().


dimension()

Returns the dimension of the logged data for this variable, as an integer.


ids()

Returns an iterator over all available ids for this variable, such that the second index (y in the simulation) changes fastest. For example, for log entries:

0.0.membrane.V
0.1.membrane.V
0.2.membrane.V
1.0.membrane.V
1.1.membrane.V
1.2.membrane.V
the returned result would iterate over:

[(0, 0), (0, 1), (0, 2), (1, 0), (1, 1), (1, 2)]

The keys are returned in the same order as the ids.
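The “second index changes fastest” ordering is exactly the iteration order of itertools.product; a sketch for a regular nx-by-ny grid:

```python
from itertools import product

def ids_2d(nx, ny):
    # All (x, y) ids with y (the second index) changing fastest.
    return list(product(range(nx), range(ny)))
```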


Returns True if the following conditions are met:

  • The data is 2 dimensional
  • The data is continuous: along each dimension the first data point is indexed as 0 and the last as Ni-1, where Ni is the size in that dimension.

keys()

Returns an iterator over all available keys for this variable, such that the second index (y in the simulation) changes fastest. For example, for log entries:

0.0.membrane.V
0.1.membrane.V
1.0.membrane.V
1.1.membrane.V
the returned iterator would produce "0.0.membrane.V", then "0.1.membrane.V" etc.

The ids are returned in the same order as the keys.


name()

Returns the variable name.


size()

Returns a tuple containing the size, i.e. the number of entries for the corresponding variable, in each dimension.

For example, with the following log entries for membrane.V:

0.membrane.V
1.membrane.V
2.membrane.V

the corresponding size would be (3,).

A size of 3 doesn't guarantee the final entry is for cell number 2. For example:

0.membrane.V
1.membrane.V
3.membrane.V

would also return size (3,).

In higher dimensions:

0.0.membrane.V
0.1.membrane.V
0.2.membrane.V
1.0.membrane.V
1.1.membrane.V
1.2.membrane.V

would return (2, 3).

Similarly, in a single cell scenario or for global variables, for example:

membrane.V

the size would be ().

myokit.prepare_log(log, model, dims=None, global_vars=None, if_empty=0, allowed_classes=15, precision=64)

Returns a DataLog for simulation classes based on a log argument passed in by the user. The model the simulations will be based on should be passed in as model.

The log argument can take on one of four forms:

An existing simulation log
In this case, the log is tested for compatibility with the given model and simulation dimensions. For single-cell simulations, all keys in the log must correspond to the qname of a loggable variable (i.e. not a constant). For multi-cellular simulations, all keys in the log must have the form “x.component.variable”, where “x” is the cell index (for example “1” or “0.3”).
A list (or other sequence) of variable names to log.
In this case, the list is converted to a DataLog object. All arguments in the list must be either strings corresponding to the variables’ qnames (so “membrane.V”) or variable objects from the given model. In multi-cell scenarios, passing in the qname of a variable (for example “membrane.V”) will cause every cell’s instance of this variable to be logged. To log only specific cells’ values, pass in the indexed name (for example “1.2.membrane.V”).
An integer flag

One of the following integer flags:

myokit.LOG_NONE
Don't log any variables.
myokit.LOG_STATE
Log all state variables.
myokit.LOG_BOUND
Log all variables bound to an external value. The method will assume any bound variables still present in the model will be provided by the simulation engine.
myokit.LOG_INTER
Log all intermediary variables.
myokit.LOG_DERIV
Log the derivatives of the state variables.
myokit.LOG_ALL
Combines all the previous flags.

Flags can be chained together, for example log=myokit.LOG_STATE+myokit.LOG_BOUND will log all bound variables and all states.

None

In this case the value from if_empty will be copied into log before the function proceeds to build a log.

For multi-dimensional logs the simulation dimensions can be passed in as a tuple of dimension sizes, for example (10,) for a cable of 10 cells and (30,20) for a 30 by 20 piece of tissue.

Simulations can define variables to be either per-cell or global. Time, for example, is typically a global variable while membrane potential will be stored per cell. To indicate which is which, a list of global variables can be passed in as global_vars.

The argument if_empty is used to set a default value if log is given as None.

The argument allowed_classes is an integer flag that determines which type of variables are allowed in this log.

When a new DataLog is created by this method, the internal storage uses arrays from the array module. The data type for these new arrays can be specified using the precision argument.
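The flag chaining described above works because each class of variable occupies its own bit; a sketch with hypothetical values (inferred from the default allowed_classes=15; the actual myokit constants may differ):

```python
# Hypothetical bit-flag values: myokit defines constants under these names,
# but the numeric values here are assumptions for illustration.
LOG_NONE = 0
LOG_STATE = 1
LOG_BOUND = 2
LOG_INTER = 4
LOG_DERIV = 8
LOG_ALL = LOG_STATE + LOG_BOUND + LOG_INTER + LOG_DERIV  # == 15

def wants(log_flag, kind):
    # A variable class is requested when its bit is set in the combined flag.
    return bool(log_flag & kind)
```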


myokit.split_key(key)

Splits a log entry name into a cell index part and a variable name part.

The cell index will be an empty string for 0d entries or global variables. For higher dimensional cases it will be the cell index in each dimension, each followed by a period, for example '15.2.'.

The two parts returned by split_key may always be concatenated to obtain the original entry.
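A sketch of the splitting rule (illustrative helper; assumes the index parts are purely numeric):

```python
def split_key(key):
    # Leading all-digit parts form the cell index; the rest is the name.
    parts = key.split('.')
    i = 0
    while i < len(parts) and parts[i].isdigit():
        i += 1
    index = '.'.join(parts[:i]) + ('.' if i else '')
    return index, '.'.join(parts[i:])
```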