Straddling our system for sharing data are the tools to gather and analyze it
Not sure where else to put this, so I'm putting it here. Since I've been trying to integrate Bruker's microscopy software, Prairie View, with Python and Arduino (with varying success), I've been talking a lot with some of Bruker's employees. One of the support members said that they get questions daily about how to drive their software from Python or MATLAB, but they don't have examples to share with people. Presumably they don't have the time or resources to build out more complete examples using open-source tools or pipelines since they're so busy building the hardware involved. The documentation for actually using their tools with external software is pretty sparse.

This is something I've noticed in my (super limited) experience in science: companies get really good at building the tool and then leave it to you, the experimenter, to analyze the data and interface with the machine (if that's even possible). It's strange that many companies aren't more involved on the analytics side given their enormous expertise in how the data is produced. I'd imagine they have best practices for analyzing the data too, sitting in silos of their own.

What do you think we can do to get companies more engaged with open-source development that's based on their tools? Is that even a good goal to have?
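For context, the kind of "example to share" being asked for doesn't have to be elaborate. Here's a minimal sketch of the Arduino side of an integration like mine: it assumes a hypothetical Arduino sketch that prints comma-separated "trial,value" lines over USB serial, and that pyserial is installed. The port name and message format are made up for illustration; nothing here is Bruker's API.

```python
def parse_reading(line: bytes) -> tuple[int, float]:
    """Parse one 'trial,value' line from the (hypothetical) Arduino sketch.

    e.g. b'3,1.25\n' -> (3, 1.25)
    """
    trial, value = line.decode("ascii").strip().split(",")
    return int(trial), float(value)


def stream_from_arduino(port: str = "COM3", baud: int = 115200):
    """Yield parsed (trial, value) readings from the serial port.

    Requires pyserial; imported here so parse_reading stays usable
    (and testable) without any hardware attached.
    """
    import serial  # pip install pyserial

    with serial.Serial(port, baud, timeout=1) as conn:
        while True:
            raw = conn.readline()  # returns b'' on timeout
            if raw:
                yield parse_reading(raw)
```

A vendor-published equivalent of this, plus a short script showing how to hand those readings to their acquisition software, would answer most of those daily questions.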