Most scientific teams start with the same toolbox: MATLAB, a few Python scripts, LabVIEW if they have a legacy license, maybe a Jupyter notebook for data analysis. These tools are fast to prototype in, every PhD student knows them, and for a single instrument they're honestly fine.

The problem starts the moment your experiment stops being "take one measurement" and becomes "coordinate five instruments in a specific sequence, log everything with timestamps, catch failures gracefully, and let a non-programmer run the whole thing at 3 AM." That's where MATLAB starts falling apart and where we get the phone call.

When MATLAB isn't enough

We don't have anything against MATLAB. It's a perfectly good environment for exploratory analysis and quick prototyping. But it was never designed to be an instrument control platform for production research setups, and that shows as soon as your experiment gets complex.

The specific moments where our clients start asking for a custom application:

  • Multiple instruments in a sequence - you need to tell the power supply to ramp up, wait for the laser to stabilize, trigger the spectrometer, capture the detector output, move the motion stage to the next position, and repeat - all with exact timing between steps.
  • Real-time feedback loops - the experiment has to react to measurements as they come in. If the sample temperature drifts, the next step changes. MATLAB's event loop doesn't handle this cleanly.
  • Long-running experiments - 12, 24, 72 hours of continuous measurements where one crash means a lost week. You need graceful error handling, auto-recovery, and watchdogs.
  • Multiple users - PhD students, postdocs, visiting researchers all need to run the setup without understanding the code. They need a UI they can trust.
  • Data integrity at scale - when you're generating gigabytes per hour, you need proper file formats, metadata, versioning, and a way to query results later.
  • Reproducibility - every result needs to be traceable to the exact instrument settings, firmware version, and calibration state that produced it.
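The "graceful error handling, auto-recovery" point above can be made concrete with a small retry wrapper. This is a minimal sketch with a simulated flaky sensor, not production watchdog code; the function names and failure mode are hypothetical:

```python
import time
import functools

def with_retries(attempts=3, delay_s=1.0, backoff=2.0):
    """Retry a flaky instrument call a few times before giving up.

    A real setup would also log every failure and possibly trigger
    a hardware reset between attempts.
    """
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            wait = delay_s
            for attempt in range(1, attempts + 1):
                try:
                    return fn(*args, **kwargs)
                except IOError:
                    if attempt == attempts:
                        raise
                    time.sleep(wait)
                    wait *= backoff
        return wrapper
    return decorator

# Hypothetical flaky reading: fails twice, then succeeds.
calls = {"n": 0}

@with_retries(attempts=3, delay_s=0.01)
def read_temperature():
    calls["n"] += 1
    if calls["n"] < 3:
        raise IOError("sensor timeout")
    return 21.5

print(read_temperature())  # 21.5, after two silent retries
```

In a long-running experiment, this kind of wrapper sits between the sequence engine and every driver call, so one transient bus timeout doesn't kill a 72-hour measurement.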

You can kind of do all of this in MATLAB. We've seen labs try. The code ends up as 15,000 lines of .m files, nobody but the original author understands it, and when that person graduates the whole thing becomes untouchable. Six months later the lab is running parallel experiments on the side and paying us to rebuild it properly.

What a "real" scientific application looks like

When we build custom software for an experiment, it usually has four layers:

1. Instrument drivers

A clean abstraction for each piece of hardware - whether it's a spectrometer over USB, a laser controller over RS485, a DAQ card over PCIe, a motion stage over Ethernet, or a custom board we built ourselves. Each driver exposes a simple interface: connect(), set_parameter(), measure(), shutdown(). If a vendor ships an SDK, we wrap it. If they don't, we implement the protocol from their manual.
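The four-call interface above can be sketched as an abstract base class. The `SimulatedSpectrometer` below and its parameter names are hypothetical stand-ins for illustration, not a real vendor driver:

```python
from abc import ABC, abstractmethod

class InstrumentDriver(ABC):
    """Common interface every hardware driver implements.

    Anything vendor-specific (SDK calls, wire protocol) stays
    hidden behind this boundary.
    """

    @abstractmethod
    def connect(self) -> None: ...

    @abstractmethod
    def set_parameter(self, name: str, value: float) -> None: ...

    @abstractmethod
    def measure(self) -> float: ...

    @abstractmethod
    def shutdown(self) -> None: ...

class SimulatedSpectrometer(InstrumentDriver):
    """Stand-in driver for testing sequences without hardware."""

    def __init__(self):
        self.params = {}
        self.connected = False

    def connect(self):
        self.connected = True

    def set_parameter(self, name, value):
        self.params[name] = value

    def measure(self):
        # A real driver would trigger an acquisition here.
        return self.params.get("integration_ms", 10.0) * 0.001

    def shutdown(self):
        self.connected = False

spec = SimulatedSpectrometer()
spec.connect()
spec.set_parameter("integration_ms", 50.0)
print(spec.measure())  # 0.05
```

A useful side effect of this layout: the sequence engine can be developed and tested against simulated drivers long before the real hardware is on the bench.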

2. Sequence engine

The heart of any experiment control application. This is where you define "what happens in what order" - in a way that's deterministic, inspectable, and safe to rerun. We usually implement this as a state machine with explicit transitions, not as a script. That means if something fails in step 3 of 12, you know exactly where you are and can decide whether to retry, skip, or abort.
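A minimal sketch of such a state machine, assuming hypothetical step names (ramp, stabilize, capture). The point is that every transition is explicit, so an unexpected jump raises instead of silently continuing:

```python
from enum import Enum, auto

class Step(Enum):
    IDLE = auto()
    RAMP = auto()
    STABILIZE = auto()
    CAPTURE = auto()
    DONE = auto()
    FAULT = auto()

# Explicit allowed transitions: anything else is a bug, not a silent skip.
TRANSITIONS = {
    Step.IDLE: {Step.RAMP},
    Step.RAMP: {Step.STABILIZE, Step.FAULT},
    Step.STABILIZE: {Step.CAPTURE, Step.FAULT},
    Step.CAPTURE: {Step.DONE, Step.FAULT},
}

class SequenceEngine:
    def __init__(self):
        self.state = Step.IDLE
        self.history = [self.state]   # inspectable: where have we been?

    def advance(self, next_state: Step) -> None:
        allowed = TRANSITIONS.get(self.state, set())
        if next_state not in allowed:
            raise RuntimeError(f"illegal transition {self.state} -> {next_state}")
        self.state = next_state
        self.history.append(next_state)

engine = SequenceEngine()
for step in (Step.RAMP, Step.STABILIZE, Step.CAPTURE, Step.DONE):
    engine.advance(step)
print(engine.state)  # Step.DONE
```

Because the history is recorded, "something failed in step 3 of 12" becomes a concrete, loggable fact rather than a guess.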

3. Data pipeline

Every reading gets timestamped, tagged with the current experiment metadata, and written to disk (or database) in a format that's easy to query later. We use HDF5 for big numerical datasets, SQLite or Postgres for metadata, and sometimes InfluxDB for time-series. The raw data is never "lost in a running buffer" - everything that matters gets persisted immediately.
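The metadata side of that pipeline can be illustrated with Python's built-in sqlite3 module (the HDF5 half is omitted here). The run and channel names are made up for the example:

```python
import sqlite3
import time

# In-memory DB for the sketch; a real pipeline writes to a file
# stored next to the HDF5 datasets it describes.
db = sqlite3.connect(":memory:")
db.execute("""
    CREATE TABLE readings (
        ts REAL NOT NULL,       -- unix timestamp of the reading
        run_id TEXT NOT NULL,   -- ties the row to an experiment run
        channel TEXT NOT NULL,
        value REAL NOT NULL
    )
""")

def persist(run_id: str, channel: str, value: float) -> None:
    """Write one reading immediately; nothing lives only in RAM."""
    db.execute(
        "INSERT INTO readings VALUES (?, ?, ?, ?)",
        (time.time(), run_id, channel, value),
    )
    db.commit()

persist("run-042", "temp_probe_1", 21.4)
persist("run-042", "temp_probe_2", 21.9)

rows = db.execute(
    "SELECT channel, value FROM readings WHERE run_id = ?", ("run-042",)
).fetchall()
print(rows)  # [('temp_probe_1', 21.4), ('temp_probe_2', 21.9)]
```

The `commit()` after every insert is the "persisted immediately" guarantee from the text; for very high rates you would batch writes, but never hold data only in a volatile buffer.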

4. User interface

A real UI that a non-programmer can operate. Live plots of whatever the experiment is measuring, big readable status indicators, a clear "start / pause / stop" flow, error messages that say what to do next. On desktop we typically build with Python + Qt, Electron if the team wants web technologies, or native Windows apps in C# when there are Windows-specific drivers involved.

Technologies we actually use

Every project is different, but our usual stack looks something like this:

Backend / control logic

  • Python - the lingua franca of scientific computing. Easy to hire for, massive ecosystem (NumPy, SciPy, PyVISA, pySerial, asyncio), fast enough for 95% of experiments.
  • C++ - when we need hard real-time performance or have to integrate with low-level vendor SDKs.
  • C# - for Windows-native applications where the instruments ship with .NET SDKs.
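As a small illustration of why asyncio earns its place in that Python bullet, here is a hedged sketch of two simulated instruments polled concurrently; no real hardware or library beyond the standard library is involved:

```python
import asyncio

async def poll(name: str, period_s: float, samples: int) -> list:
    """Simulated instrument that yields one reading per period."""
    readings = []
    for i in range(samples):
        await asyncio.sleep(period_s)  # stands in for an I/O-bound read
        readings.append((name, i))
    return readings

async def main():
    # Both instruments are polled concurrently, so the total wall time
    # is set by the slower one, not by the sum of both.
    fast, slow = await asyncio.gather(
        poll("daq", 0.01, 3),
        poll("spectrometer", 0.02, 3),
    )
    return fast + slow

results = asyncio.run(main())
print(len(results))  # 6
```

Since instrument I/O is mostly waiting, a single asyncio event loop usually covers the "95% of experiments" case without threads or real-time extensions.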

UI

  • PyQt / PySide - the best balance of power and productivity for desktop scientific apps.
  • Electron + React - when the customer wants a web-technology UI or needs the same app to run on Windows, macOS and Linux without rebuilding.
  • Streamlit - for quick internal dashboards and data exploration, not for production control.

Data and persistence

  • HDF5 - the de-facto standard for scientific datasets. Hierarchical, self-describing, fast to query.
  • SQLite / PostgreSQL - for experiment metadata, runs, parameter sweeps.
  • InfluxDB - when the data is time-series and the team wants Grafana-style visualization.

Communication with instruments

  • VISA, SCPI, Modbus, serial, USB-TMC, TCP/IP, custom binary protocols - we've written clients for all of them.
  • PyVISA as the starting point, custom drivers when the vendor stack is too limited.
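SCPI itself is just ASCII text over whatever transport the instrument speaks; PyVISA handles the actual I/O. A minimal sketch of building and parsing commands, using generic SCPI subsystem names as examples:

```python
def scpi_set(subsystem: str, parameter: str, value) -> str:
    """Build a SCPI write command, e.g. 'SOUR:VOLT 5.0'."""
    return f"{subsystem}:{parameter} {value}"

def scpi_query(subsystem: str, parameter: str) -> str:
    """Build the matching query; SCPI queries end with '?'."""
    return f"{subsystem}:{parameter}?"

def parse_scpi_float(response: str) -> float:
    """Instruments answer queries with ASCII numbers plus a terminator."""
    return float(response.strip())

print(scpi_set("SOUR", "VOLT", 5.0))     # SOUR:VOLT 5.0
print(scpi_query("MEAS", "CURR"))        # MEAS:CURR?
print(parse_scpi_float("+2.500E-01\n"))  # 0.25
```

With PyVISA, these strings would be passed to an opened resource's `write()` and `query()` calls; the helper layer above is where per-instrument quirks (terminators, unit suffixes) get normalized.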

Analysis layer

  • NumPy, SciPy, pandas for numerical work
  • Matplotlib, Plotly, Bokeh for visualization
  • Jupyter notebooks for post-experiment analysis
  • scikit-learn or PyTorch when ML models become part of the pipeline

Real examples from our work

Diffusion cloud chamber control

Our cloud chamber instrument needs to manage a three-stage Peltier stack, monitor multiple temperature probes, control the illumination system, trigger a camera for track capture, and log everything continuously. We built the full control application in Python with a PyQt interface. The operator sees live temperature curves for each Peltier stage, the current chamber state (cooling / ready / imaging / fault), and can start a session with a single click. The underlying sequence engine handles cold-soak, stabilization, and data capture automatically. When something goes wrong - a thermal runaway, a loose sensor - the app brings the system to a safe state and logs exactly what happened for later analysis.

Spectroscopy application

For a research client doing thermoelectric electroporation spectroscopy, we built a custom application that coordinates a cuvette holder with precise non-contact temperature control, a spectrometer for absorbance readings, and a programmable waveform generator that drives the sample. The experiment protocol requires ramping temperature at a specific rate, firing a programmed voltage pulse, and capturing the optical response at sub-millisecond resolution - all while keeping everything synchronized with the sample's measured state. The existing MATLAB setup could barely keep up with one parameter sweep; our replacement runs full 2D sweeps overnight and exports publication-ready datasets.

Laser control software

We've written several laser control applications, from single-laser drivers with a clean GUI to multi-laser systems where beams have to be steered, shuttered and intensity-modulated in synchrony. Typical features: real-time power monitoring, safety interlocks with visual state indication, programmable pulse sequences, beam position logging, and integration with external trigger sources. In one project we combined a custom driver board (built in-house) with a PyQt control app that replaced three separate vendor tools the lab used to juggle. Setup time for an experiment dropped from 40 minutes to under five.

What this looks like for you

If you're running a scientific setup where:

  • MATLAB or Python scripts have become the project's biggest technical debt
  • Only one person in the lab actually knows how to run the experiment
  • Data ends up in a tangle of CSV files nobody can trace back to conditions
  • You want a real application with a UI, error handling and proper logging
  • You need to combine instruments from different vendors into a single workflow

...then custom scientific software is exactly what we build. We understand the instruments, we understand the physics and biology context, and we know how to turn a fragile research rig into a proper measurement platform.

The best outcome we've delivered: a postdoc came back from a conference, sat down at the new control app, and ran the experiment without ever touching a line of code. That's what production-grade scientific software should feel like.

Got an experiment that's outgrown its MATLAB origins? Let's talk and we'll help you figure out the right path forward.