Open hardware accelerates the process of science and helps address challenges

Open hardware machines for science

This blog is part of a series on open hardware and key messages for public policy. Read the introduction and access other #OHpolicy blogs here.

By Nadya Peek, Machine Agency, University of Washington

In Kurt Vonnegut’s satirical Hocus Pocus, a college professor fired for socialist views is making ends meet teaching at a local for-profit prison. Reflecting on a rapidly unraveling society, the character notes:

“Another flaw in the human character is that everybody wants to build and nobody wants to do maintenance.”

Hocus Pocus is dark, and full of dubious characters. We might hope that, outside of fiction, we are doing better than a solely profit-driven and racist society with decrepit infrastructure. Yet the flaw Vonnegut points out seems quite unresolved — especially in science.

Science is grappling with many failures. The general public is distrustful. Incentives are perverse. There is a crisis of reproducibility. Further, while a child’s image of a scientist might be of someone in a lab coat testing hypotheses by conducting experiments with beakers of colorful liquids, this picture fails to show the extreme complexity of contemporary science.

For example, a scientist working on solar cell efficiency cannot just mix liquids together in the hopes of stumbling across a more efficient crystal structure for photovoltaics. The number of experiments they would have to do before they might see improvements could be endless. Instead, such a scientist now might rely on insights for promising areas of exploration from their colleague’s physics-based simulations — or another colleague’s insights from using machine learning with existing datasets. The number of experiments to conduct once they have identified a promising area is still so vast that they need to rely on many skilled technicians working in well-appointed labs to perform them. With enough time, this process works — solar cells were 4% efficient a decade ago and are 25% efficient now — but it is tremendously complex.

Succinctly: knowledge is not gathered by isolated individuals, but by collaborative teams who are constantly seeking and trading players.

Working in these large ad-hoc teams requires coordination, communication, and co-developing workflows all teammates can trust. Science happens in the details: thoughtful experiment design, data collection, and analysis. But when the rubber hits the road, some of these details can be deeply frustrating: experiments failing for logistical instead of scientific reasons, infuriating equipment interfaces for expensive machines, endless accounting paperwork, “cloud” data management, and toxic publishing culture.

The process of science moves quickly when a team can rely on a solid infrastructure for collaboration. Ideally, our scientist considering solar cells can quickly design and perform experiments based on data their colleagues collected, then share their experimental results back without getting bogged down by compatibility issues and technical problems. Shared trustworthy infrastructures of open source hardware, software APIs, and data standards help avoid possible pitfalls. This encourages reuse and improves reliability.

Standardization, open source, and robust infrastructure underpin the modern communication systems that make it possible for all of us to browse, stream, and videochat. We know that open source has been good for internet infrastructure, where millions of people do similar tasks. However, in science, there are wild differences in tasks based on the field of science, the experimental methods, and what constitutes knowledge. Making crystals for solar cell production is very different from measuring how plankton travels in the deep sea, which is very different from identifying lipids in microbial cell membranes, which is very different from analyzing song patterns in migrating birds.

Therefore, unlike the internet, any science task might need highly customized infrastructure.

Open source hardware allows for community development of reliable core designs that scientists can tailor for their own custom research applications. Scientists working together across disciplines can build robust collaboration mechanisms that work across hardware, software, and protocols using trusted standards and APIs. Scientists who can invent and improve their tools can not only make science faster and cheaper, but can fundamentally reinvent how we do science in the first place.

Open, tailorable, and extensible systems make results more repeatable, less expensive, and ultimately speed up the process of science. This broadens participation to groups that cannot otherwise manage the overhead associated with complex collaborations. Creating incentives to adopt, develop, and maintain such open systems is therefore critical for the future of science.

We are facing wicked problems related to energy, the environment, and inequality. The complexity of these problems dictates the complexity of the science we need to address them. We do not have time to waste. If scientists are encouraged to work together on their shared infrastructure, it will be easier for them to work together on our shared planet. Open source hardware is one way to win some time.

The Journal of Open Hardware is an Open Access initiative run by the Global Open Science Hardware community and published by Ubiquity Press.