No problem
I think that’s a tale as old as automation. You have the people who buy the automation, and then the people who actually use the automation. And at most companies that don’t have a dedicated automation team (or at least a person who knows what they are doing), the buyers and the users are different people.
Management will see these instruments at a conference, on LinkedIn, or through a salesperson, hear a few buzzwords, and decide that $150k isn’t that bad to dip the company’s feet into automation. After all, it’s code-free, anyone can use it, and this will look great at the annual review meeting!
The users in the lab then get insufficient training and/or aren’t given time to sit down and properly write, test, and use the instrument, and it becomes a $150k paperweight destined to be bought up by Copia Scientific for 1/10 of the purchase price. They then sell it at 5x profit to one of us, who loves it because X/Y/Z: it fits the niche you needed and you got a great discount off the original purchase price.
Unless Tecan has implemented a plain-language AI interface, I don’t see how they can create a “code-free” automated liquid handler. Unless your task is reagent distribution or something and you want users to write a script per volume they want to distribute (because otherwise you’d make the volume to distribute a variable, which involves code).
Color me jaded and unimpressed.
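To make that last point concrete, here is a minimal sketch of why parameterizing the volume already means writing code; the `lh` aspirate/dispense calls below are hypothetical placeholders, not any vendor’s actual API.

```python
# Minimal illustrative sketch -- `lh` and its aspirate/dispense methods are
# hypothetical placeholders, not a real vendor API. Tip capacity and
# multi-dispense limits are ignored for brevity.

def distribute_reagent(lh, source, destinations, volume_ul):
    """Distribute `volume_ul` from one source to many destination wells.

    Making the volume an argument is exactly the step that turns a
    "script per volume" into one reusable method -- i.e. code.
    """
    for well in destinations:
        lh.aspirate(source, volume_ul)
        lh.dispense(well, volume_ul)

# The same method covers 20 uL today and 50 uL tomorrow:
# distribute_reagent(lh, "trough_A1", plate_wells, 20)
# distribute_reagent(lh, "trough_A1", plate_wells, 50)
```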
Or they get sufficient training and are immediately worth 3x their salary at other positions, even including field apps for the instrument mfr. Playing devil’s advocate, maybe the script-free model and pre-loaded applications are meant to combat this. On the same note, this would be easier for clinical environments and would reduce turnaround time from purchase to utilization.
Also, am I crazy or does it look like the Veya has different channel tips than the FCA/Liha and maybe more than 8 channels? I can’t find any higher resolution images on the slide deck to confirm.
Wow, great conversation.
All of these things can be and are true in this industry. We’ve all seen programmatic software & robots that are scrambling to build GUIs and low-code/no-code interfaces, but also GUI-first, low-code/no-code software & robots that are scrambling to make themselves more programmatic.
Judging from the look of it, this appears to be Tecan’s MAPlinx software product, which can be customized.
It’s good to see something in the $ price range from Tecan. Increased market competition in this range produces better products for all of us, novice or expert.
@evwolfson I agree, it looks more akin to their customized systems, which probably come with fewer bells and whistles than the AirFCA arm.
I’ve been at Synthace for 6 years and we’ve been banging the drum on no-code interfaces, so it’s great to see the industry now picking up on this and making their instruments much easier to use.
I’ve spoken to many scientists in my time at Synthace, and the main thing we hear from people who have automation is how difficult it is to use. Scientists are not engineers; their priority is to push the boundaries of science, not to script. Instruments collect dust, and a lot of the time the scripts were written by people who have already left the company without doing a handover or writing up notes on how to use the automation.
I think git-controlled code should still be used, especially in validation and with QC protocols. But in R&D, when a scientist needs to make changes quickly, do you really want someone reviewing a git change for a volume or plate swap? Yes for QC, but for R&D this slows things down, puts scientists off automation entirely, and they go back to doing things manually.
So I personally think this is going to be a great addition to entry-level automation under six figures. Early startups won’t have automation engineers, probably just a bunch of scientists who can readily use these instruments. When they expand and prove out their POC, they can add Fluents etc., and that’s when they can think about bringing in automation engineers to look after them and manage the tech stack.
Ah interesting to get others perspectives!
I do think one of the main problems behind the statement “Instruments collect dust, and a lot of the time the scripts were written by people who have already left the company without doing a handover or writing up notes on how to use the automation” is the knowledge gap involved in getting a script written in a bespoke programming language up and running, along with needing an IDE you can’t easily google for help with. I’d say it is hard because of the esoteric IDEs and languages involved. There’s no real standard: you can’t apply best programming practices, or even google them, and until this forum there weren’t even search results. SLAS talks are behind a paywall and hard to find.

For example, why should a cherrypicking script need a write-up on how it works? If it’s programmed in plain Python with correct docstrings and linted, I doubt you’d need a whole document describing such a small program. Lots of the scripts we write are pretty small in the scheme of things compared to complex software integrations. Maybe I’m just starting to write the sales pitch for Opentrons and PyLabRobot at this point …
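To illustrate the cherrypicking point above, here is a rough sketch of how small such a script can be in plain Python with a docstring; the `lh` transfer calls are stand-ins, not the Opentrons or PyLabRobot API specifically.

```python
from typing import Iterable, Tuple

def cherrypick(lh, picks: Iterable[Tuple[str, str, float]]) -> None:
    """Transfer each (source_well, dest_well, volume_ul) tuple in `picks`.

    Example:
        cherrypick(lh, [("src_A1", "dst_B2", 15.0),
                        ("src_C3", "dst_B3", 7.5)])
    """
    for source, dest, volume_ul in picks:
        lh.pick_up_tip()               # fresh tip per transfer to avoid carryover
        lh.aspirate(source, volume_ul)
        lh.dispense(dest, volume_ul)
        lh.drop_tip()
```

With a docstring, type hints, and a linter, a program this size largely documents itself; a separate write-up adds little.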
I think you do tbh.
I agree. I didn’t want to get granular, but before we used proper git workflows I can’t tell you how many times random volumes got changed to deal with constraints specific to a single run and then never got reset afterwards.
This could probably be a whole topic of its own: “The state of lab automation.” Maybe I’m unaware, but this industry really seems to be a niche compared to other software industries and applications.
Don’t get me started on the sales pitch for Synthace
Now’s your chance! LOL
I agree with everything you and others have said.
Honestly, my slightly controversial, personal feeling is that no-code/low-code platforms are a net negative for the lab-automation sphere. Sure, there are some use cases where they are helpful, but I’d much rather companies like Tecan, Hamilton, etc. actually focus on DX (developer experience) and provide an environment where proper programming and automation practices can be carried out: things like VCS, debugging, and so on. And yes, even in R&D. If anything, that’s even more reason to have those in R&D, so that all the changes can be tracked and correlated.
I don’t think companies should be empowering every scientist to configure automation. I think companies should be empowering people like us to configure automation for every scientist. Those two things are different and need different things
If you look to other industries, I fear lab automation is lagging behind. Look at software dev, car industries, food/agriculture industries, for example.
I love a spicy take! With that said, do you feel that no-code/low-code automation can also help drive adoption of more complex automation?
“…I fear lab automation is lagging behind. Look at software dev, car industries, food/agriculture industries, for example.”
I want to say that we lab automation folks put these industries on a pedestal. Sure, there’s a gap right now, but I think the top 1% of lab automation groups are not too far behind; they probably employ a similar tech stack and use similar communication standards. It’s also important to remember that software dev, car, and food/ag industries just have significantly higher average profit margins, so they become the targets for new tools and the test beds for technologies before those trickle down to other groups and then to consumers. That’s largely a market-driven phenomenon.
However, if you look, there are thousands of horror stories just as bad as anything in lab automation: respectable, large companies keeping their backend data in Google Sheets or Excel files. The grass is always greener…
Personally I don’t see it enabling either. Even when we’ve tried low-code solutions, they have always, in the end, required an automation engineer to reconfigure the software or at least help with the liquid class development. Then, most of the time, you can’t get the complex functionality you need out of a low-code solution and it becomes a paperweight.
Here’s my hot take: if Tecan or Hamilton adopted a Python-based environment, they would immediately become the leader by a long shot, dominating the space. They both have almost equivalent hardware, give or take. Combined with AI code support (Copilot/ChatGPT), I would argue that scientists are better enabled to write their own code than to use some esoteric GUI-based language that has no support and limited functionality.
I feel like a cool implementation would be node-based, similar to TouchDesigner or cables.gl, where you work in a node-based workflow but it compiles down into Python/JavaScript. That would let the scientific user build programs from nodes while engineers just write the Python directly if they prefer.
Edit: I just remembered Hamilton InstinctV (RIP), which I completely forgot was node-based and compiled to XML, I think… I do remember it never really worked. So someone somewhere did have the same idea…
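For what it’s worth, the node-graph-to-Python idea can be sketched in a few lines; every name here (the node types, the emitted `lh` calls) is hypothetical and only shows the shape of a visual graph compiling down to plain Python.

```python
# Hypothetical sketch: an ordered node graph "compiled" to plain Python source,
# so a GUI user and a Python user end up editing the same underlying code.

NODES = [
    {"type": "aspirate", "well": "src_A1", "volume_ul": 50},
    {"type": "dispense", "well": "dst_B1", "volume_ul": 50},
]

TEMPLATES = {
    "aspirate": 'lh.aspirate("{well}", {volume_ul})',
    "dispense": 'lh.dispense("{well}", {volume_ul})',
}

def compile_graph(nodes) -> str:
    """Emit a plain-Python method from an ordered list of nodes."""
    lines = ["def run(lh):"]
    lines += ["    " + TEMPLATES[n["type"]].format(**n) for n in nodes]
    return "\n".join(lines)

print(compile_graph(NODES))
# def run(lh):
#     lh.aspirate("src_A1", 50)
#     lh.dispense("dst_B1", 50)
```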
I think you are exactly right about the value in being able to translate between no-code GUIs and code.
There’s immense value in having an interface that a scientist can walk up to and write a workflow very quickly without having to know how to code. Learning the fundamentals of coding takes a long time, and requiring that as a prerequisite to do automation is going to leave a lot of capable scientists out of the equation.
I think the fundamental utility of code-based interfaces, above every other consideration, is that you can freely create your own abstractions and tools that seamlessly integrate with the liquid handler. There’s no obvious contradiction between GUIs and code, since every GUI is necessarily built with code. Allowing people to build their own custom GUIs will lead to far more solutions than a manufacturer could possibly implement. Even the biggest robot vendors have far fewer than 100 software developers. That’s probably fewer than the number of people who can write automation software who will log onto this forum today.
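As a small, hypothetical illustration of “freely create your own abstractions”: once the liquid handler is drivable from code, a lab can wrap its own recurring pattern without waiting for a vendor to ship it. The helper and the `lh` methods below are assumed stand-ins, not a specific vendor driver.

```python
def serial_dilution(lh, wells, transfer_ul, mix_cycles=3):
    """Hypothetical lab-specific abstraction: carry sample down a column of
    wells that are already pre-filled with diluent."""
    for src, dst in zip(wells, wells[1:]):
        lh.aspirate(src, transfer_ul)
        lh.dispense(dst, transfer_ul)
        lh.mix(dst, transfer_ul, cycles=mix_cycles)   # assumed mix call
```

A custom GUI button can then call `serial_dilution` directly, which is exactly the kind of lab-specific solution a manufacturer is unlikely to ship for every variant.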
Also, I’m going to split this topic in two because the conversation has diverged. I want people to be able to find Tecan Veya information easily and to also find the no-code discussion easily.
It’s interesting to me that all these companies are going “less programming needed” when, if you look at this forum, everyone is aiming for real IDEs, git-controllable code, and production environments. Maybe it’s an anti-pattern: the loudest market segment is non-programming users reaching out to the marketing teams at Tecan, Hamilton, and Agilent, while most of us who can take control of our code do so by finding hacks and then working inside the system to make semi-usable production environments.
I distinctly remember walking up to the Agilent booth at SLAS and a sales rep saying “with our new GUIs you don’t have to write a single piece of code,” which made me immediately question how anyone unit tests anything produced or version controls their assays! I also remember that two years ago at SLAS the most attended talk was on how to implement git and break Hamilton code down into a production environment. So who is driving this? Is this what the market really wants?
Nah.
This forum skews hard one way for sure (that’s why I am one of the most active posters), but these conversations are more nuanced because business decisions tend to be nuanced. Some companies need to start somewhere. Even cloud services have no-code/low-code options for a reason. (A good example of this is moving from zero to implementation with Benchling compared to a more customizable LIMS/ELN.) I’ve also talked to plenty of founders and companies that start with something as simple as an OT-2 or a dispenser, and then use that to launch into more complex automation.
There’s definitely room for both. If people really want something, they need to advocate for something with those companies. It’s amazing what consumers can do with momentum.
However, since you use Tecan, I want to add that Tecan has given people the option to program their Fluent in C# or Python (just a wrapper anyway, like everything), and that’s been available for years. They even have actual API interfaces that let you build your own GUIs on top of them, and you can even use a communication standard (SiLA2) if you want to integrate their systems into your own custom environments. In fact, there’s further active development on it, and their Driver Framework now supports SiLA2 from the jump. They’re also adopting modular frameworks that may take years to develop. These are all signals of positive intent! And yet, they’re still not the “leader by a long shot dominating the space”.
There’s a hard disconnect between what people want and what they vote for with their wallets.
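To illustrate the “build your own GUIs on top of their APIs” point above, here is a deliberately tiny, hypothetical sketch: a few lines of Tkinter wrapping an assumed vendor Python wrapper. `run_method` and the commented import are placeholders, not Tecan’s actual interface.

```python
import tkinter as tk

# from fluent_api import connect   # assumed vendor wrapper, shown for shape only

def run_method(method_name: str, sample_count: int) -> None:
    """Placeholder: a real version would call the vendor API here,
    e.g. something like connect().run(method_name, samples=sample_count)."""
    print(f"Would run {method_name} with {sample_count} samples")

root = tk.Tk()
root.title("Walk-up runner")
count = tk.IntVar(value=96)
tk.Label(root, text="Sample count").pack()
tk.Entry(root, textvariable=count).pack()
tk.Button(root, text="Run ELISA prep",
          command=lambda: run_method("elisa_prep", count.get())).pack()
root.mainloop()
```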
I think hardware (in general) really could improve here as well. My ideal state for this sort of work would be low-code program → simulate → run samples. No testing. No plate definitions. No liquid class tinkering. It just works (barring user error).
As a bridge to this, it’d be really cool to see a platform that does automated testing and parameter setting. That workflow might look like: low-code program → simulate → first-time run → user samples, where the first-time run prompts the user to load the appropriate labware and representative liquids, then runs through a series of tests to determine what the labware definitions should look like and how the various parameters should be set.
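A hedged sketch of what that bridge could look like in code; none of these functions exist in a real product, they just name the stages described above (simulate, first-time calibration run, then routine samples).

```python
# Hypothetical pipeline sketch -- `lh.simulate`, `lh.calibration_run`, and
# `protocol.apply` are invented names for the stages in the workflow above.

def commission_protocol(protocol, lh):
    """Simulate first, then run a calibration pass that measures labware
    positions and liquid-class behaviour before the first real samples."""
    report = lh.simulate(protocol)               # catch logic errors dry
    if not report.ok:
        raise RuntimeError(report.errors)

    calibration = lh.calibration_run(protocol)   # prompts for representative
    protocol.apply(calibration.labware_offsets)  # labware and test liquids
    protocol.apply(calibration.liquid_classes)
    return protocol                              # ready for routine samples
```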
You may be interested in the Floi8. It does automatic liquid classes and has a camera to check that the correct labware is on deck, along with force sensors in X and Y for labware teaching when you want to use arbitrary labware. In practice, though, the low-code environment made it impossible to control all the liquid class parameters when the automatic settings got it wrong.