The DDC standard says that hardware must support a 100 kHz clock rate, and that only the graphics adapter can be the (clock) master of the I²C bus there. So, your device needs to work at any speed of 100 kHz and below, including I²C-funky things like clock stretching. (Never implement I²C yourself in software. If your microcontroller has no I²C peripheral, use a different microcontroller, or use a different bus.)
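For a sense of what your device would be sharing that bus with: on a Linux host, the EDID read that DDC exists for looks roughly like the sketch below. It uses the standard i2c-dev interface; the bus number in /dev/i2c-5 is just a placeholder for whichever adapter the graphics driver happens to expose, and opening it already needs root or i2c group membership (the "questionable privileges" mentioned further down).

```c
/* Minimal sketch: read a monitor's 128-byte base EDID block over DDC
 * from the host side, via Linux's i2c-dev interface.
 * Assumptions: /dev/i2c-5 is the adapter your graphics driver exposes
 * for that connector (the number varies), and the process is allowed
 * to open it (root, or membership in the i2c group). */
#include <fcntl.h>
#include <linux/i2c-dev.h>
#include <stdint.h>
#include <stdio.h>
#include <sys/ioctl.h>
#include <unistd.h>

int main(void)
{
    int fd = open("/dev/i2c-5", O_RDWR);    /* bus number is an example */
    if (fd < 0) { perror("open"); return 1; }

    /* 0x50 is the fixed I2C address of the EDID EEPROM in the monitor */
    if (ioctl(fd, I2C_SLAVE, 0x50) < 0) { perror("ioctl"); return 1; }

    uint8_t offset = 0x00;                   /* start at EDID byte 0 */
    uint8_t edid[128];
    if (write(fd, &offset, 1) != 1 ||
        read(fd, edid, sizeof edid) != (ssize_t)sizeof edid) {
        perror("EDID read");
        return 1;
    }

    printf("EDID header: %02x %02x %02x %02x ...\n",
           edid[0], edid[1], edid[2], edid[3]);
    close(fd);
    return 0;
}
```

Every transaction like this that the OS, the desktop environment, or a monitor-control tool fires off would race with whatever protocol your controller invents on the same two wires.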

If I might offer one word of advice:

Don't.

I used to do technical support for an embedded device whose internal EEPROMs sat on the same I²C bus as the mini-HDMI connector on its front panel.

Sometimes (as in: this happened to multiple independent clients of mine), using the graphical user interface while the hardware backend was querying properties from the internal EEPROMs made those EEPROMs misinterpret the EDID replies coming from the screen as write commands, because the timing of the two sets of bus transactions collided.

And those were specialist devices, where such bus transactions would happen no more than maybe once a minute. Extrapolate to how many transactions your controller will cause, and how likely it is that you'll clash with the actual function of DDC and give a user a really bad day.

Graphics card drivers often just bitbang the I²C bus, and they can't be aware of multi-transfer transactions happening on it.

Don't try to hang a game controller off a screen interface. That will only end in tears: it requires very questionable privileges for any software that needs to interact with the controller, and I²C-to-USB adapters cost less than 2 € even at the most expensive but trustworthy electronics distributors. There is no way your "buy a display connector, find an unused graphics card port, or build a pass-through device that still works even though video signals are extremely high-bandwidth, sensitive signals" approach comes even close to that in cost.

If you're building a game controller for a 1980s console, maybe consider the specific buses those consoles had – you'd be writing very custom low-level software for them anyway.

More realistically, just make your custom game controller's microcontroller speak USB directly. The Human Interface Device (HID) class is about the simplest thing you can implement in USB. Every relevant microcontroller family you'd pick for anything that isn't produced by the millions or that doesn't have to live on microwatts has members with USB hardware built in. And if you need something you can plug into the back of your original IBM PC, literally any microcontroller has a UART.
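To give a sense of how little "speaking HID" actually involves: the sketch below is a complete report descriptor for a hypothetical eight-button, two-axis gamepad, plus the matching three-byte input report. It is generic USB-HID, the names are mine, and it assumes you hand both to whatever USB device stack your microcontroller vendor ships (TinyUSB, a vendor HAL, or similar).

```c
#include <stdint.h>

/* Sketch of a HID report descriptor for a gamepad with 8 buttons and
 * X/Y axes. The USB stack sends this to the host once at enumeration;
 * afterwards the firmware only has to send 3-byte reports. */
static const uint8_t gamepad_report_desc[] = {
    0x05, 0x01,       /* Usage Page (Generic Desktop)       */
    0x09, 0x05,       /* Usage (Game Pad)                   */
    0xA1, 0x01,       /* Collection (Application)           */
    0x05, 0x09,       /*   Usage Page (Button)              */
    0x19, 0x01,       /*   Usage Minimum (Button 1)         */
    0x29, 0x08,       /*   Usage Maximum (Button 8)         */
    0x15, 0x00,       /*   Logical Minimum (0)              */
    0x25, 0x01,       /*   Logical Maximum (1)              */
    0x75, 0x01,       /*   Report Size (1 bit)              */
    0x95, 0x08,       /*   Report Count (8)                 */
    0x81, 0x02,       /*   Input (Data, Variable, Absolute) */
    0x05, 0x01,       /*   Usage Page (Generic Desktop)     */
    0x09, 0x30,       /*   Usage (X)                        */
    0x09, 0x31,       /*   Usage (Y)                        */
    0x15, 0x81,       /*   Logical Minimum (-127)           */
    0x25, 0x7F,       /*   Logical Maximum (127)            */
    0x75, 0x08,       /*   Report Size (8 bits)             */
    0x95, 0x02,       /*   Report Count (2)                 */
    0x81, 0x02,       /*   Input (Data, Variable, Absolute) */
    0xC0              /* End Collection                     */
};

/* The matching input report the firmware sends on each poll. */
struct __attribute__((packed)) gamepad_report {
    uint8_t buttons;  /* bit 0 = button 1 ... bit 7 = button 8 */
    int8_t  x;        /* -127 .. 127 */
    int8_t  y;        /* -127 .. 127 */
};
```

The host's generic HID driver takes care of the rest, so there is no custom driver and no special privileges needed on the PC side.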
