Conversation
CRAZY COOOOL! this is VERY complete
Pull request overview
This PR adds support for the Tecan Spark plate reader by implementing a comprehensive async backend with USB communication, packet parsing, and data processing capabilities. The implementation includes controls for various instrument subsystems and processors for absorbance and fluorescence measurements.
Key Changes
- New async USB reader with packet parsing for Tecan Spark devices
- Absorbance and fluorescence data processors with real-world test coverage
- Comprehensive control modules for plate transport, optics, sensors, measurements, and system configuration
- Utility function for calculating non-overlapping well rectangles
Reviewed changes
Copilot reviewed 23 out of 24 changed files in this pull request and generated 2 comments.
| File | Description |
|---|---|
| pylabrobot/plate_reading/utils.py | Adds helper to find non-overlapping rectangles covering plate wells |
| pylabrobot/plate_reading/tecan/spark20m/spark_reader_async.py | Implements async USB communication layer for Spark devices |
| pylabrobot/plate_reading/tecan/spark20m/spark_packet_parser.py | Parses binary measurement data packets from the device |
| pylabrobot/plate_reading/tecan/spark20m/spark_processor.py | Processes raw measurement data into absorbance/fluorescence values |
| pylabrobot/plate_reading/tecan/spark20m/spark_backend.py | Main backend integrating all components for plate reading operations |
| pylabrobot/plate_reading/tecan/spark20m/controls/*.py | Control modules for device subsystems (optics, sensors, movement, etc.) |
| pylabrobot/plate_reading/tecan/spark20m/*_tests.py | Unit tests for reader, processor, and backend components |
pylabrobot/plate_reading/tecan/spark20m/controls/system_control.py
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
is this ready for review?
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
Code review changes

Simplified the Spark control architecture:
- Remove AutoReaderProxy, integrate reading into send_command
- Remove reading context manager (no longer needed)
- Rename control classes to follow PEP8 (BaseControl, PlateControl, etc.)
- BaseControl now takes send_command directly instead of reader
- Make init_read and get_response private methods
- Add io.binary.Reader utility class
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
# Conflicts:
#   pylabrobot/io/binary.py
#   pylabrobot/io/usb.py
fc9cf36 to b56ffdd (force-pushed)
Replace custom byte parsing helpers (_read_bytes, _read_u8, _read_u16, _read_u32, _read_string) with the Reader class from pylabrobot.io.binary. Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
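The Reader class in pylabrobot.io.binary is not shown in this thread; a minimal sketch of what such a cursor-based binary reader looks like, built on the stdlib struct module (hypothetical API, not necessarily identical to the real class):

```python
import struct


class Reader:
  """Minimal little-endian binary reader sketch. Illustrative only:
  the actual pylabrobot.io.binary.Reader API may differ."""

  def __init__(self, data: bytes):
    self._data = data
    self._pos = 0

  def read_bytes(self, n: int) -> bytes:
    if self._pos + n > len(self._data):
      raise EOFError("not enough data in buffer")
    chunk = self._data[self._pos : self._pos + n]
    self._pos += n
    return chunk

  def read_u8(self) -> int:
    return self.read_bytes(1)[0]

  def read_u16(self) -> int:
    return struct.unpack("<H", self.read_bytes(2))[0]

  def read_u32(self) -> int:
    return struct.unpack("<I", self.read_bytes(4))[0]

  def read_string(self, n: int) -> str:
    # fixed-width field, trailing NULs stripped
    return self.read_bytes(n).rstrip(b"\x00").decode("ascii")
```

Centralizing the cursor bookkeeping in one class is what lets the packet parser drop its five ad-hoc `_read_*` helpers.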
Use pytest.approx() instead of exact equality for float comparisons in test_process_real_data tests. Compare row by row since pytest.approx does not support nested data structures. Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
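The row-by-row workaround can be sketched like this (the helper name is illustrative, not from the PR):

```python
import pytest


def assert_rows_approx(actual, expected):
  """Compare 2D float data row by row: pytest.approx does not support
  nested data structures, so each row is compared separately."""
  assert len(actual) == len(expected)
  for actual_row, expected_row in zip(actual, expected):
    assert actual_row == pytest.approx(expected_row)
```

This keeps the tolerance handling of `pytest.approx` while sidestepping its limitation on lists of lists.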
Add max_workers parameter to USB class constructor with default of 1 to maintain backward compatibility. SparkReaderAsync passes max_workers=16 for its specific needs. Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
An earlier commit fixed the timeout conversion for the main write but missed the zero-length packet write. PyUSB expects timeouts in milliseconds, so the seconds value must be multiplied by 1000. Also add a max_workers parameter to USBValidator for signature compatibility. Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
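The unit conversion at the PyUSB boundary, with an illustrative wrapper name (PyUSB's `Endpoint.write`/`read` take their timeout in milliseconds, while PyLabRobot-style APIs tend to use seconds):

```python
def write_with_timeout(endpoint, data: bytes, timeout_s: float) -> int:
  """Write `data` to a PyUSB endpoint, converting a seconds-based
  timeout to the milliseconds PyUSB expects. Sketch; the wrapper
  name is hypothetical."""
  return endpoint.write(data, timeout=int(timeout_s * 1000))
```

Doing the conversion in one wrapper avoids the class of bug fixed here, where one call site converted and another did not.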
Replace manual calculation of well spacing with the new item_dx and item_dy properties from ItemizedResource. Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
Instead of returning False on error, can we raise the error in send_command? That is a more Pythonic way of dealing with errors.
The previous implementation used an attempt counter (default 10000) which could spin for 100+ seconds with 10ms sleeps. Now uses time.monotonic() to enforce a wall-clock deadline (default 60 seconds), making the timeout behavior predictable and bounded. Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
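A sketch of the deadline-based loop described above (function and parameter names are illustrative):

```python
import time


def wait_until(condition, timeout: float = 60.0, poll_interval: float = 0.01) -> bool:
  """Poll `condition` until it returns True or a wall-clock deadline
  passes. time.monotonic() bounds the wait regardless of how long each
  poll takes, unlike a fixed attempt counter."""
  deadline = time.monotonic() + timeout
  while time.monotonic() < deadline:
    if condition():
      return True
    time.sleep(poll_interval)
  raise TimeoutError(f"condition not met within {timeout} s")
```

With an attempt counter, a slow poll multiplies the effective timeout; with a monotonic deadline it cannot exceed `timeout` by more than one poll.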
Use _safe_div() for the (raw_ref_signal - ref_dark) denominator to return NaN instead of raising ZeroDivisionError when reference signal equals reference dark value. Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
MirrorCarrier, LaserPowerState, and LightingState were already defined in spark_enums.py. Import them instead of redefining. Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
…trol

These enums are already defined in spark_enums.py. Import them instead. Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
- Add explicit None check for inner_mult_index in _parse_nested_mult call
- Import ScanDirection from spark_enums instead of re-exporting through measurement_control
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
is it a problem that
how does it look now?
this uses the io is it impossible to use add an
…ark_reader_async.py
in
…read_data endpoints

- Introduce SparkError exception so RespError packets propagate to callers instead of being silently swallowed by broad except handlers
- Add read_data endpoint keys to PLATE_TRANSPORT and LUMINESCENCE device entries in DEVICE_ENDPOINTS
- Update test to assert SparkError is raised on RespError
…ssing device

- Add dynamic USB endpoint discovery via _discover_endpoints() to support different Spark hardware variants (e.g. 0x81 vs 0x82 bulk-in endpoints)
- Add test_spark_hardware.py with connection, measurement, and transport tests
- Add SPARK_SPECTROMETER to SparkDevice enum
- Add require_device decorator for early failure when hardware is not connected
- Fix unused variable lint warning in send_command
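The endpoint-discovery idea can be illustrated without hardware: select a bulk-in endpoint by its descriptor fields rather than by a hard-coded address. This is a sketch over plain (address, attributes) pairs; the PR's _discover_endpoints() presumably operates on real PyUSB descriptors instead:

```python
def find_bulk_in_address(endpoints):
  """Pick the first bulk-in endpoint from (bEndpointAddress,
  bmAttributes) pairs, so hardware variants exposing 0x81 or 0x82
  both work. Per the USB spec, bits 0-1 of bmAttributes give the
  transfer type (0x02 = bulk) and bit 7 of the address set means
  IN (device-to-host)."""
  for address, attributes in endpoints:
    if (attributes & 0x03) == 0x02 and (address & 0x80):
      return address
  raise RuntimeError("no bulk-in endpoint found")
```

Matching on descriptor fields instead of a fixed address is what makes the backend tolerant of the 0x81-vs-0x82 variation between Spark units.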
- Add missing human_readable_device_name arg to USB() constructor
- Set up mock devices dict in test so device-connected checks pass
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
- Fix ruff formatting in usb.py (lambda line wrapping)
- Guard usb.core/usb.util imports in spark_reader_async.py with try/except
- Guard SparkBackend import in plate_reading/__init__.py for envs without USB
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
57e2c0a to e5bc326 (force-pushed)
Match the established pattern from PR PyLabRobot#941: optional dependency errors should be deferred to instantiation time, not silently swallowed at import time. Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Mark the Tecan Spark backend as experimental and replace the interactive hardware test script with a proper docs notebook. Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
thanks!
Tecan Spark WIP; luminescence coming up soon. I had a hard time getting io.usb to work with this, so I'm directly using libusb for now.