File

lumicks.pylake.File

class File(filename, *, rgb_to_detectors=None)

A convenient HDF5 file wrapper for reading data exported from Bluelake

Parameters:
  • filename (str | os.PathLike) – The HDF5 file to open in read-only mode

  • rgb_to_detectors (Dict[Color, str]) – Dictionary that maps RGB colors to photon detector channels (either photon counts or photon time tags)

Examples

from lumicks import pylake

file = pylake.File("example.h5")
file.force1x.plot()
file.kymos["name"].plot()

# Open with custom detector mapping
file = pylake.File("example.h5", rgb_to_detectors={"Red": "Detector 1", "Green": "Detector 2", "Blue": "Detector 3"})

__getitem__(item)

Return a subgroup or a Bluelake timeline channel
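
For example, indexing with a Bluelake group name returns that subgroup, while indexing the subgroup again returns a channel (a minimal sketch; the group and channel names depend on what was exported):

import lumicks.pylake as lk

file = lk.File("example.h5")
group = file["Force HF"]               # subgroup with the high frequency force channels
force = file["Force HF"]["Force 1x"]   # timeline channel (data + timestamps)
force.plot()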

classmethod from_h5py(h5py_file, *, rgb_to_detectors=None)

Directly load an existing h5py.File
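
A minimal sketch of wrapping an already opened h5py.File (assumes the h5py package is available and that the file was exported from Bluelake):

import h5py
import lumicks.pylake as lk

h5 = h5py.File("example.h5", "r")   # open the file with h5py first
file = lk.File.from_h5py(h5)
file.force1x.plot()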

save_as(filename, compression_level=5, omit_data=None, *, crop_time_range=None, verbose=True)

Write a modified h5 file to disk.

When transferring data, it can be beneficial to omit some channels from the h5 file or to use a higher compression level. High frequency channels tend to take up a lot of space and are not needed for every analysis. Note that Bluelake exports files at compression level 1 for performance reasons, so this function can reduce the file size even when no data is omitted.

Parameters:
  • filename (str | os.PathLike) – Output file name.

  • compression_level (int) – Compression level for gzip compression (default: 5).

  • omit_data (str or iterable of str, optional) – Which data sets to omit. Should be a set of h5 paths (e.g. {"Force HF/Force 1y"}). fnmatch patterns are used to specify which fields to omit, which means you can use wildcards as well (see examples below).

  • crop_time_range (tuple of np.int64, optional) – Specify a time interval to crop to (tuple of a start and stop time). Interval must be specified in nanoseconds since epoch (the same format as timestamps).

  • verbose (bool, optional) – Print verbose output. Default: True.

Examples

import lumicks.pylake as lk

file = lk.File("example.h5")

# Saves a file with a high compression level
file.save_as("smaller.h5", compression_level=9)

# Omit high frequency force data.
file.save_as("no_hf.h5", omit_data="Force HF/*")

# Omit Force 1y data
file.save_as("no_hf.h5", omit_data="*/Force 1y")

# Omit Force 1y and 2y data
file.save_as("no_hf.h5", omit_data=("*/Force 1y", "*/Force 2y"))

# Omit high frequency force data for channel 1y
file.save_as("no_1y.h5", omit_data="Force HF/Force 1y")

# Omit Scan "1"
file.save_as("no_scan_1.h5", omit_data="Scan/1")

# Save only the region that contains the kymograph `kymo1`.
kymo = file.kymos["kymo1"]
file.save_as("only_kymo.h5", crop_time_range=(kymo.start, kymo.stop))

property bluelake_version: str

The version of Bluelake which exported this file

property description: str

The description of the measurement as entered by the user in Bluelake

property experiment: str

The name of the experiment as entered by the user in Bluelake

property export_time: int

The moment this file was exported

property fdcurves: Dict[str, FdCurve]

FdCurves stored in the file

property format_version: int

The version of the Bluelake-specific HDF5 file structure

property guid: str

An ID which uniquely identifies each exported file

property kymos: Dict[str, Kymo]

Kymos stored in the file

property markers: Dict[str, Marker]

Markers stored in the file

property notes: Dict[str, Note]

Notes stored in the file

property point_scans: Dict[str, Scan]

Point Scans stored in the file

property scans: Dict[str, Scan]

Confocal Scans stored in the file
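
The dictionary-like properties above behave like regular Python dictionaries (a minimal sketch; the available item names depend on the file):

import lumicks.pylake as lk

file = lk.File("example.h5")
print(file.bluelake_version, file.experiment)

# List the kymographs stored in this file and plot the F,d curves
for name in file.kymos:
    print(name)
for name, fd_curve in file.fdcurves.items():
    fd_curve.plot_scatter()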