Sync data from common measurement frameworks
QCoDeS Synchronization
To add a QCoDeS database for synchronization:
Launch the sync agent GUI.
Open the Add Source menu and click on the QCoDeS tab.
Define a name that indicates which database is being synchronized, specify the set-up at which your measurements were taken, and select your QCoDeS database, e.g.
mydatabase.db.
The synchronization should begin immediately once the database is selected.
Note
It is also possible to add the QCoDeS database programmatically by running the following code.
import pathlib
from etiket_client.sync.backends.qcodes.qcodes_sync_class import QCoDeSSync, QCoDeSConfigData
from etiket_client.sync.backends.sources import add_sync_source
from etiket_client.python_api.scopes import get_scope_by_name
data_path = pathlib.Path('/path/to/my/database.db')
scope = get_scope_by_name('scope_name')
# optional: add extra attributes
extra_attributes = {
    'attribute_name': 'attribute_value'
}
configData = QCoDeSConfigData(database_directory=data_path,
                              set_up="my_setup",
                              extra_attributes=extra_attributes)
add_sync_source('my_sync_source_name', QCoDeSSync, configData, scope)
Adding QCoDeS metadata to a dataset
You can attach tags and attributes to a dataset by adding the QHMetaData instrument to your QCoDeS Station. Because the instrument is recorded in the QCoDeS snapshot, we can extract the tags and attributes and add them to the QHarbor dataset.
You can add both static and dynamic metadata:
Static metadata stays the same for all your measurements (e.g., project, lab PC).
Dynamic metadata changes per run (e.g., calibration type, measurement type, …).
Initializing the instrument
There are two ways to define the instrument: directly in Python or via a Station YAML file.
Using Python
from qdrive.utility.qcodes_metadata import QHMetaData
from qcodes import Station
station = Station()
# Create and register the instrument
qh_meta = QHMetaData("qh_meta",
                     static_tags=["cool_experiment"],
                     static_attributes={"project": "resonator project"},
                     )
station.add_component(qh_meta)
Note
The instrument name is fixed to qh_meta; do not change this.
Using a YAML configuration file
Alternatively, the same instrument can be defined in a Station YAML config:
instruments:
  qh_meta:
    type: qdrive.utility.qcodes_metadata.QHMetaData
    init:
      static_tags:
        - "cool_experiment"
      static_attributes:
        "project": "resonator project"
        "measurement_computer": "LAB-A-PC-5"
Note
The instrument name is fixed to qh_meta; do not change this key.
The instrument can then be loaded as follows:
from qcodes import Station
# replace 'qc_config.yaml' with the name of your station configuration file
station = Station(config_file='qc_config.yaml')
station.load_instrument("qh_meta")
For more information, see the QCoDeS docs: Configuring the Station by using a YAML configuration file.
Runtime usage
Access the instrument and set per-run metadata in one place:
# If defined via Station YAML
qh_meta = station.components["qh_meta"]
# Start of run: clear dynamic values, then set new ones
qh_meta.reset()
qh_meta.add_tags(["testing"]) # e.g. ["dynamic_tag1", "dynamic_tag2"]
qh_meta.add_attributes({
    "calibration": "ALLXY",  # e.g. {"key": "value"}
})
# measurement happens here
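Since the station snapshot is recorded when a run starts, the per-run metadata should be set before entering the measurement context. The sketch below shows where this fits in a standard QCoDeS measurement; the experiment, sample, and parameter names are purely illustrative.
import numpy as np
from qcodes.dataset import Measurement, load_or_create_experiment

exp = load_or_create_experiment("resonator_scan", sample_name="sample_A")
meas = Measurement(exp=exp, station=station)  # station contains qh_meta
meas.register_custom_parameter("frequency", unit="Hz")
meas.register_custom_parameter("amplitude", setpoints=("frequency",))

# Set the per-run metadata before the run starts,
# so it ends up in the snapshot of this dataset.
qh_meta.reset()
qh_meta.add_tags(["resonator_scan"])
qh_meta.add_attributes({"calibration": "none"})

with meas.run() as datasaver:
    for f in np.linspace(4e9, 6e9, 11):
        datasaver.add_result(("frequency", f), ("amplitude", np.random.rand()))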
Behavior notes
Static vs dynamic: Static tags/attributes are set at construction and persist across reset(). Dynamic ones are cleared by reset().
Overrides: Dynamic attributes with the same key as a static attribute will override the static value in the combined view.
Tags order/duplicates: Dynamic tags are appended; duplicates are preserved.
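As a short illustration of these rules, using only the methods shown above (the comments describe the expected combined view that ends up in the snapshot):
# Continuing with the qh_meta instrument created above
# (static_tags=["cool_experiment"], static_attributes={"project": "resonator project"})

# First run: dynamic values are added on top of the static ones
qh_meta.add_tags(["run_1"])
qh_meta.add_attributes({"project": "spin qubit project", "calibration": "ALLXY"})
# combined tags:       ["cool_experiment", "run_1"]
# combined attributes: {"project": "spin qubit project", "calibration": "ALLXY"}
#                      (the dynamic "project" overrides the static value)

# Next run: reset() clears only the dynamic values, the static ones remain
qh_meta.reset()
# combined tags:       ["cool_experiment"]
# combined attributes: {"project": "resonator project"}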
Quantify (QMI) Synchronization
For Quantify data, the expected folder structure should resemble the following format:
main_folder
├── 20240101
│ ├── 20240101-211245-165-731d85-experiment_1
│ │ ├── 01-01-2024_01-01-01.json
│ │ ├── 01-01-2024_01-01-01.hdf5
├── 20240102
│ ├── 20240102-220655-268-455d85-experiment_2
│ │ ├── 02-01-2024_02-02-02.json
│ │ ├── 02-01-2024_02-02-02.hdf5
To set up synchronization for Quantify data:
Launch the sync agent GUI.
Open the Add Source menu and click on the Quantify tab.
Define a name that indicates which data is being synchronized, specify the set-up at which your measurements were taken, and select the folder containing your Quantify data, e.g.,
main_folder in this example.
The synchronization should start automatically after the folder is selected.
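Like the other backends, a Quantify source can presumably also be added programmatically with add_sync_source. The module path, class names, and the quantify_directory field in the sketch below are assumptions modeled on the QCoDeS and Labber examples and may differ in your version of etiket_client.
import pathlib
# Assumed import path, mirroring the QCoDeS/Labber backends; verify against your installation.
from etiket_client.sync.backends.quantify.quantify_sync_class import QuantifySync, QuantifyConfigData
from etiket_client.sync.backends.sources import add_sync_source
from etiket_client.python_api.scopes import get_scope_by_name

data_path = pathlib.Path('/path/to/main_folder')
scope = get_scope_by_name('scope_name')

config_data = QuantifyConfigData(quantify_directory=data_path, set_up="my_setup")
add_sync_source('my_quantify_source', QuantifySync, config_data, scope)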
Labber Synchronization
To set up synchronization for Labber data, you need to provide the path to your Labber dataset directory.
Labber typically organizes data in a hierarchical folder structure. For example:
main_folder
├── 2024
│ ├── 01
│ │ ├── Data_0101
│ │ │ ├── measurement_1.hdf5
│ │ │ ├── measurement_2.hdf5
│ │ ├── Data_0102
│ │ │ ├── measurement_1.hdf5
│ │ │ ├── measurement_2.hdf5
In this structure, you should provide the root folder (main_folder) to the sync agent.
Setting up Labber synchronization programmatically:
import pathlib
from etiket_client.sync.backends.labber.labber_sync_class import LabberSync, LabberConfigData
from etiket_client.sync.backends.sources import add_sync_source
from etiket_client.python_api.scopes import get_scope_by_name
# Define the path to your Labber data directory
data_path = pathlib.Path('C:/path/to/your/labber/main_folder')
# Get the target scope for synchronization
scope = get_scope_by_name('my_scope')
# Configure the Labber synchronization settings
config_data = LabberConfigData(
    labber_directory=data_path,
    set_up="my_experimental_setup"
)

# Add the synchronization source
add_sync_source(
    'labber_sync_source',
    LabberSync,
    config_data,
    scope
)
Note
Support for adding Labber synchronization through the GUI is coming soon.
Core-Tools Synchronization
First make sure to install both pulse-lib and core-tools in the same environment as qDrive by running:
pip install git+https://github.com/stephanlphilips/pulse_lib
pip install git+https://github.com/stephanlphilips/core_tools
To configure synchronization with Core-Tools, you’ll need the credentials for the Core-Tools database.
These credentials are usually stored in the ct_config.yaml file or initialized within the core-tools setup, for example:
from core_tools.data.SQL.connect import SQL_conn_info_local
SQL_conn_info_local(dbname='dbname', user="user_name", passwd="password",
                    host="localhost", port=5432)
Launch the sync agent GUI.
Open the Add Source menu and click on the Core-Tools tab.
Fill in the Core-Tools database details.
Warning
Please only sync data from the local PostgreSQL database.
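For completeness, a programmatic setup in the style of the other backends might look as follows. This is only a sketch: the module path, class names, and configuration fields (CoreToolsSync, CoreToolsConfigData, and the credential arguments) are assumptions based on the QCoDeS and Labber examples, so check them against your etiket_client installation before use.
# Assumed import path and config fields, mirroring the other backends; verify before use.
from etiket_client.sync.backends.core_tools.core_tools_sync_class import CoreToolsSync, CoreToolsConfigData
from etiket_client.sync.backends.sources import add_sync_source
from etiket_client.python_api.scopes import get_scope_by_name

scope = get_scope_by_name('my_scope')

# Credentials of the local PostgreSQL database used by core-tools
config_data = CoreToolsConfigData(dbname='dbname', user='user_name', password='password',
                                  host='localhost', port=5432, set_up='my_setup')
add_sync_source('core_tools_sync_source', CoreToolsSync, config_data, scope)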