Is it time to update the Open Hardware Definition for AI or make a parallel one for AI?

Hey open-source hardware makers! It might be time to update the Open Hardware Definition: it's over 10 years old (here is my post on MAKE Magazine from February 10th, 2011). A lot has changed in the last 10+ years for open-source hardware and open-source software, and some things have not! There was/is an Open Source Hardware (OSHW) Definition 1.1 draft, but it has not been updated on the wiki since December 10, 2018.

While there are a few things I'd update in the Open Hardware Definition 1.0, my focus is on adding something to address AI, ChatGPT, humans working with LLMs, etc. I'm going to propose the same thing that I've attempted to have the Open Source Initiative consider for their OSI + AI license / definition (here's a blog post about that as well). The goal is sharing exactly which tools were used, and in what ways, so that others can replicate (and iterate) with AI/LLMs. It's a little different from commenting code or publishing code under an open-source license, but the intent can be the same.

My addition to the definition is specific to this "freedom":

“Study how the system works and inspect its components.”

The OSI + AI definition currently leaves out the inspection of prompts and data access transparency, so here is the proposed addition to the OSI + AI definition, and to an updated Open Hardware Definition or a parallel definition specific to AI.


Inspection of Prompts and Data Access Transparency:

In addition to the existing requirements, the preferred form for making modifications to a machine-learning system shall include access to the prompts and commands used during the training phase and/or code and hardware creation. This will enable users to understand the context in which the model was developed, including:

  • Prompt Transparency: Access to a detailed log of all prompts, commands, and instructions used during the training phase and/or code and hardware creation, ensuring that users can see the exact inputs that shaped the model’s behavior.
  • Justification and Documentation: Each prompt should be accompanied by documentation explaining its purpose, how it was constructed, and its expected impact on the model’s development.
  • Replicability and Testing: The framework should provide means for users to replicate prompt scenarios to test modifications and understand their effects on the model’s outputs.
  • Prompt and Model Linking: Direct links to the specific model versions used along with the corresponding prompts, enabling a traceable lineage from input to model behavior.
  • Timestamp and Metadata Documentation: Each entry of the prompt log should be timestamped and include metadata such as the version of the model used at that time.
  • Public Access to Logs: Where possible, logs of the prompts should be made publicly available, with links provided in the documentation to ensure that users can review the historical context and development trajectory of the model.

This addition aims to enhance transparency and foster an environment where users can more effectively audit, replicate, and modify AI behavior.
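As a concrete illustration of what a prompt log entry like the one described above might look like, here's a minimal Python sketch. The field names, the `PromptLogEntry` class, the sensor name, and the JSONL file format are all my own illustrative assumptions, not part of any definition:

```python
import json
from dataclasses import dataclass, asdict, field
from datetime import datetime, timezone


@dataclass
class PromptLogEntry:
    """One entry in a publicly publishable prompt log (hypothetical schema)."""
    prompt: str           # the exact input sent to the model
    model: str            # model name/version used at the time
    purpose: str          # why the prompt was issued
    expected_impact: str  # anticipated effect on the code or hardware
    model_link: str       # link to the specific model version's documentation
    timestamp: str = ""   # filled in automatically if left blank

    def __post_init__(self):
        if not self.timestamp:
            self.timestamp = datetime.now(timezone.utc).isoformat()


def append_to_log(entry: PromptLogEntry, path: str = "prompt_log.jsonl") -> None:
    """Append one entry as a JSON line so the full log can be published alongside the code."""
    with open(path, "a") as f:
        f.write(json.dumps(asdict(entry)) + "\n")


# Example usage (the sensor and repo details here are made up):
entry = PromptLogEntry(
    prompt="Write an Arduino I2C driver skeleton for the XYZ9000 sensor from this datasheet text...",
    model="gpt-4o-2024-08-06",
    purpose="Generate the initial driver skeleton",
    expected_impact="First draft of the driver published in the project repo",
    model_link="https://example.com/model-docs",
)
append_to_log(entry)
```

A plain JSONL file like this covers most of the bullets above in one place: each line is timestamped, names the model version, links to it, and records the justification, so anyone can audit or replay the prompts.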

And of course we have a real-world example; we've been doing this for about a year! Check out our video and article, "Writing an Arduino driver with OpenAI ChatGPT and PDF parsing," and here's an example of prompt transparency when publishing open-source code.

What’s next? Proving that there is enough demand within the open-source hardware community to actually try to update the Open Hardware Definition for AI, either as a revision of the existing definition or as a parallel one. If there is, we can figure out what a legitimate process would look like and how it would work. I’ll email OSHWA, hit the forums / various Discord(s), and email open-source hardware makers.

If there isn’t any interest in an update, I will probably publish an Open Hardware Definition for AI, which would be the current definition with the AI additions. And since AI also stands for Adafruit Industries, I suppose it would be what Adafruit uses when it refers to open-source hardware where AI was used in some way, transparently, and others can adopt it over time (or not). I guess we’d need a logo too.


*header image background question marks made with DALL-E 2 and GPT-4o Aug 29, 2024.




