Wireless power


This article/section is a stub — some half-sorted notes, not necessarily checked, not necessarily correct. Feel free to ignore, or tell me about it.




tl;dr:

  • not very much power
  • not very long distance
  • only moderately efficient, lessened with range

As such,

  • it's not used on beefier devices,
  • it has mostly ended up as a convenience e.g. in mobile devices.



There are three major variants:

  • Wireless Power Consortium (WPC)
      e.g. Qi
      frequency: ~110-200kHz
      power: order of 5 Watts (planned higher) (verify)
      coupling: inductive
  • Power Matters Alliance (PMA)
      frequency: ~300kHz (277 kHz to 357 kHz?)
      power: 3.5W to 15W, planned higher? (verify)
      coupling: inductive
  • Alliance for Wireless Power (A4WP)
      e.g. Rezence
      frequency: ~6.6MHz
      power: order of 5 Watts (planned higher)
      range: up to 5 cm (for decent efficiency; can work at more), allowing e.g. under-desk mount
      coupling: resonant



On range

The range of each of these is sometimes quoted as 10 meters.

However, this is a "you can maybe tell a transmitting device is nearby" range. If it manages to transfer anything at all at that distance, efficiency will most likely be absolutely terrible. Assume it needs to be nearly touching to get halfway-okay efficiency.
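To make that falloff concrete, here is a minimal, illustrative sketch (not taken from any of these specs) of a common first-order model for a resonant inductive link: best-case link efficiency is set by the coupling coefficient and the coil quality factors, and coupling drops roughly with the cube of distance once the coils are farther apart than their own radius. The coil size, Q factors, and coupling cap below are assumed values, picked only to illustrate the trend.

 # Illustrative sketch only: a common first-order model of a resonant inductive
 # link, showing why efficiency collapses with distance. Coil radius, Q factors,
 # and the 0.6 close-contact coupling cap are assumptions, not measurements.
 import math

 def max_link_efficiency(k, q1, q2):
     """Best-case link efficiency for coupling coefficient k and coil Q factors."""
     u = k * math.sqrt(q1 * q2)                  # link figure of merit
     return u**2 / (1 + math.sqrt(1 + u**2))**2

 coil_radius = 0.02                              # ~2 cm coils (assumed)
 q1 = q2 = 100                                   # fairly optimistic Q factors (assumed)

 for d_cm in (0.5, 1, 2, 5, 10, 100):
     d = d_cm / 100
     # coupling falls off roughly with (radius/distance)^3 once d >> radius,
     # capped at a plausible close-contact value
     k = min(0.6, (coil_radius / d) ** 3)
     print(f"{d_cm:>5g} cm: eta_max ~ {max_link_efficiency(k, q1, q2):.3f}")

With those assumed numbers, the model gives 90%+ link efficiency at contact, tens of percent at several centimeters, and essentially nothing at a meter, which is the "near-touching" rule of thumb in practice.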


On efficiency

It turns out there is usually a very real difference between

ideal hardware in lab conditions - Minimal distance. Expensive hardware. Good shielding. Good alignment of transmitter and receiver.
...and what you will see in reality.


Assume that a figure quoted from lab conditions, transported to a real-world device, will actually be noticeably lower.

Real-world data (including some official graphs from manufacturers) as well as amateur tests (e.g. charging a phone wired versus wireless) suggest efficiency tops out at 60% to maybe 70%, and decreases with distance, poor alignment, and cheap designs.

From an engineering standpoint, that peak is actually pretty decent.

From an efficiency standpoint, it means that in a good case you are using ~40% more power than just plugging it in, and maybe twice as much in a bad case.
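As a quick back-of-the-envelope check on those figures (illustrative arithmetic only; it also treats the wired path as nearly lossless, which it isn't quite):

 # Illustrative arithmetic behind the "~40% more / maybe twice" claim:
 # wall power is roughly delivered power divided by end-to-end efficiency.
 def wall_power(delivered_w, efficiency):
     return delivered_w / efficiency

 for eff in (0.70, 0.60, 0.50):
     overhead = wall_power(5.0, eff) / 5.0 - 1   # extra draw vs. an (assumed near-lossless) wired charge
     print(f"at {eff:.0%} efficiency: ~{overhead:.0%} extra power drawn")
 # at 70% efficiency: ~43% extra power drawn
 # at 60% efficiency: ~67% extra power drawn
 # at 50% efficiency: ~100% extra power drawn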