Start of the article: "My priority at CES every year is to find futuristic new technology that I can get excited about. But even at a tech show as large as CES, this can be surprisingly difficult. If I’m very lucky [...]"
End of the article: "Bosch covered our costs for attending CES 2020."
Also: "After making a minor nuisance of myself, Bosch agreed to give me a private demo"
I mean it's possible they paid for him (and others) and still didn't allow him to get a demo... but it's unlikely. Also, I feel like IEEE can afford to pay the attendance costs for a reporter.
So, this is really just promoted content/an ad.
Which is fine, and I think it's even fine to wait until the end to tell us, but not if the article body tries to paint a different, organic picture. That framing wasn't at all necessary for the content; it was there simply to distract from the paid-promo nature of the piece.
"We don't want this getting out just yet, but... oh okay, you can come behind the curtain just this once."
<end of private demo>
"So, what'dya think? By the way, here's a fresh marketing video we happen to have lying around. Tell everyone. No no, actually, let us pay you to tell everyone. We insist."
I wondered this myself, but I searched the IEEE website and it appears there are several stories about various Bosch technologies, so it's possible Bosch expected this writer to cover their other products and wasn't planning to demo this particular item. Given that Bosch is looking for a commercial partner rather than selling directly to the public, they might have brought the glasses to show to possible partners, not reporters.
Yeah, that's a big no-no for credibility. At least for me, I have to trust a stranger before I could ever convert on a pitch, and stating any financial obligations, fiduciary duties, or conflicts of interest upfront or early on is far better than tacking them on later.
Sneaky disclosures kill any interest I have in a product/company. Case in point: I'm not even going to read this article after reading this comment.
>Robert Bosch LLC provided travel support for us to attend CES 2020. Bosch Sensortec, responsible for the Smartglasses Light Drive, was not aware of nor involved with the travel support.
I quoted what was on the page when I read it. They changed it at least twice; the last time I checked, it said something about Bosch Sensortec, and I had to Google it to see whether they were related to the glasses. They have since apparently added that in as well.
Holy shit. The parent comment is as disingenuous as the implication it's putting on the author of the article. It earned a very rare HN downvote from me.
As an amateur runner and triathlete I really hope someone will integrate this technology into suitable smart sunglasses. I would pay $1000 today for such a product if it actually works. My GPS fitness tracker is a great tool but I hate having to constantly glance down at my wrist to check pace, distance, and heart rate. This is particularly annoying when executing a structured intervals workout based on specific target metrics.
There are existing heads-up display products targeted at cyclists, such as the Everysight Raptor and Garmin Varia Vision. However, they aren't practical or comfortable for runners.
Ideally I'd like the smart glasses to have the following: an ANT+ Extended Display profile, 6-hour battery life, light weight with even distribution (not all on one side), and prescription-lens compatibility.
I don't trust any of the major tech companies to do a good job with this product. How long until your weather app needs access to that always-on camera in your glasses?
Practically, I also can't imagine how they'll solve the weight/battery/networking issues, particularly in athletic settings where you're literally putting the device through a constant earthquake.
Maybe we should stop wishing for some ridiculous convenience layer to be added to our lives and just look around during our runs like we've done for thousands of years.
Or (and this is controversial), design a convenience layer around some other sensory input that doesn't require vision or alienate people with visual disabilities.
Agreed, I am also an experienced runner and this would be wonderful. Assuming it can get GPS and map trails, trail running would be vastly improved by this. Right now people have to paint trees every 20 feet or so to mark the trail, but if you have a live HUD it can tell you if you're on or off course fairly quickly.
Trouble is GPS isn't always reliable in heavily wooded areas, so there might need to be improvements in GPS before that is really viable.
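For what it's worth, the on-course/off-course check itself is straightforward once you have the trail as a recorded GPS track: compute the distance from the current fix to the nearest segment of the track polyline and alert when it exceeds a threshold. A minimal sketch in Python (the function names and the 30 m threshold are my own assumptions; a real version would want proper geodesic math and smoothing against GPS jitter):

    import math

    def to_xy(lat, lon, lat0):
        # Equirectangular approximation: fine over a few hundred meters.
        R = 6371000.0  # Earth radius, meters
        return (math.radians(lon) * R * math.cos(math.radians(lat0)),
                math.radians(lat) * R)

    def dist_to_segment(p, a, b):
        # Distance in meters from point p to segment a-b (all in x/y meters).
        ax, ay = a; bx, by = b; px, py = p
        dx, dy = bx - ax, by - ay
        if dx == 0 and dy == 0:
            return math.hypot(px - ax, py - ay)
        t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
        return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

    def off_course(fix, track, threshold_m=30.0):
        # fix: (lat, lon); track: list of (lat, lon) points from the recorded trail.
        lat0 = fix[0]
        p = to_xy(fix[0], fix[1], lat0)
        pts = [to_xy(lat, lon, lat0) for lat, lon in track]
        return min(dist_to_segment(p, a, b) for a, b in zip(pts, pts[1:])) > threshold_m

Note the threshold has to comfortably exceed typical GPS error under tree cover, which is exactly the reliability problem mentioned above.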
I would actually prefer not to have any GNSS (GPS) receiver in the smart glasses, to keep the weight down and the battery life up. The GNSS receiver can stay in a wrist device, with the smart glasses used purely as a display linked via ANT+.
GNSS accuracy is poor in forests, canyons, and dense urban environments due to line-of-sight obstructions and multipath reflections. The latest Navstar Block III satellites should improve that a little, but any real solution would require deploying additional satellites in geosynchronous orbit like the Japanese QZSS.
I used to be an ultradistance runner and also built and sold an AR/Computer Vision company.
Unfortunately the state of localization isn't ready for passive (aka non-radiative), reliable sub-meter localization over anything more than a few square meters of well-lit indoor space.
It's likely going to be a while before we get to "wear everyday" AR glasses with sub-meter-accurate camera position/localization.
I would maybe buy smart glasses for $1000 if and only if the protocol to interface with them was documented. First because it’s unlikely they’ll support the platforms I like to use, and second because enough of these products have come and gone that I don’t trust they’ll be officially supported for more than two years or so.
Wouldn't this be a total shoo-in for an audio-based interface? Probably I don't quite understand your requirements - I'm imagining you want alerts for when you're within your pace/heartrate targets, and for when you reach specific distance targets or intervals.
Audio alerts don't work very well in practice. My fitness tracker can be configured to play alert tones when I reach my target zone or when various metrics are too high or low, but when I try to rely on those alerts I tend to overcorrect and bounce off the limits. Plus sometimes the environment is just too noisy and I can't distinguish between different alerts. That makes for a frustrating, low-quality workout. Having the actual numbers constantly in the corner of my eye would make the process much smoother and easier.
Also I hate to be "that guy" whose stupid wristwatch is constantly beeping during group workouts and races.
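On the "bounce off the limits" point above: that failure mode is basically bang-bang control. The tone fires the instant you cross a boundary, you correct hard, and you sail through the other boundary. Continuous numbers fix it because you can see your margin. If you did want audio-only, hysteresis helps; here's a toy sketch (the heart-rate zone and margin values are made up):

    def zone_alerter(low, high, margin):
        # Alert only when leaving the zone; re-arm only after settling
        # well back inside it (hysteresis), to avoid boundary chatter.
        armed = True
        def check(value):
            nonlocal armed
            if armed and (value < low or value > high):
                armed = False
                return "too low" if value < low else "too high"
            if low + margin <= value <= high - margin:
                armed = True
            return None
        return check

    hr = zone_alerter(low=145, high=160, margin=5)  # assumed HR zone, bpm
    for bpm in [150, 158, 162, 161, 155, 149, 163]:
        alert = hr(bpm)
        if alert:
            print(bpm, alert)  # fires at 162, re-arms around 155, fires again at 163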
https://www.formswim.com/

(I am in no way affiliated with this company)

I'd love to have it overlaid over the corner of my eye while I'm working out.
Social tics, such as frantically tapping the side of your head while trying to "remember" someone's name, will become normal. I thought it was weird when people were talking to themselves with bluetooth on or walking into a street sign when looking down at their phone...but it's just going to get stranger. I could easily imagine eye flickering or eye rolling when your brain OS is rebooting.

These glasses, and the contact lenses in the works, will make us forget having to worry about things we take for granted today, just as our phones helped us forget people's phone numbers or care about knowing how to get somewhere. The new devices will be contextually aware of facts we need to know, which will just appear as an overlay: no more "let me look it up on Google". If coupled with audio, GPS, and other inputs, they could be even more proactive, finding things before you even knew you needed them.
I have a Bluetooth "joystick" so small that it mounts on a ring so I can hold it. No reason not to improve on that. The interface for your walking-around glasses could be a thick ring that you rub and click with your thumb.
This is how Focals by North [0] work. Everyone I’ve asked about it has thought it’s a silly idea (to have to wear a ring), but I still can’t think of a better interface.
[0]: https://www.bynorth.com/
> I thought it was weird when people were talking to themselves with bluetooth
When Bluetooth headsets became small enough to be inconspicuous I lived in a marginal neighborhood. When my wife and I would go out, we'd play a guessing game called "Bluetooth or mental illness."
I play a game with people talking on their BT devices, where I pretend I don't already know that's what they're doing, and I talk back to them as if they're talking to me. The object of the game is to see how long I can keep them from their other conversation by refusing to acknowledge their device.
I'm not sure how it can get stranger. I've had guys come up to the urinal next to me still talking into their ear pieces. It's incredibly rude, bizarre behavior. Their conversation is taking place in an entirely different context, and I think that's where their minds usually are too. So they might not think anything of it when they approach from behind saying something like, "We need to take care of this right now" too loud and too close to your ear.
It seems rude in most contexts, to just shove your conversation into everyone else's life. Unless it's a real emergency, I don't get how people think it's okay.
I think a lot of people don't realize that a lot of people consider it a luxury to be out of contact for periods of time.
When that happens to me, I make sure to let the person on the other side of that conversation know where it's being held.
I flush repeatedly, turn on the water, and take out my phone and have very loud imaginary phone calls, or just cycle through the available ringtones at high volume.
Some people are still so low-class that it doesn't even faze them.
I already can't stand talking to people who have those air pods in or are looking at screens. I don't know if social norms around this will change or not, but I hope they don't. No one can have a straight conversation with someone distracted by other stuff. Even if that person has no music on, it still feels as though he's elsewhere.
Judging by the (upper)middle class kids I work with, the social stigma around wearing earbuds while holding a conversation is clearly on the way out. It seems like a rapid social shift (which makes me old at 34, I guess) and is probably an “important” change for those who want to sell us on the AR future.
To that end, my experience of using transparency mode on AirPod Pro earbuds is that they very much do “become invisible” while allowing me to overlay (auditory) information on the world around me. If they were built to be as inconspicuous as my father’s hearing aids, nobody would know the difference and no overt social stigma would persist. The AR future of today is auditory.
Calling AirPods “those air pods”, but also having something as “young” as “big_chungus” as your username... I’m having a hard time figuring out your age!
That is a great point about distraction and not feeling as though you are truly the focal point. It will get very odd if something like Neuralink comes into play, where the person just gets ideas as if they were their own while talking to you; then you'll never know whether they're actually thinking for themselves at all, on top of not paying you full attention. But if you're the parent of a teenager, you kind of already feel this way.
I'm sure we'll develop sensitive clothing, body feedback, and other forms of haptic interfaces - a tap on the side of the glasses might not be necessary if you can train your AR to respond to, say, snapping fingers or tapping feet.
The thing I hate about this trend is that we seem to love to forego a vital part of interfaces in the process: visibility. I don't know why designers seem to hate buttons, but it's really annoying.
EDIT: where "visibility" also includes haptic feedback and all of that other good stuff. See http://worrydream.com/ABriefRantOnTheFutureOfInteractionDesi...
> I could easily imagine eye flickering or eye rolling when your brain OS is rebooting.
That image reminds me of the transhumanist YouTube series H+. In one of the first episodes one of the characters keeps trying to “reboot” after his implant succumbs to a computer virus capable of killing people.
This reminds me of a technology from a couple of decades ago that seemed to disappear: wearable computer monitors. What ever happened to those?
Basically, you wore this thing like eyeglasses on your head, and it had a small arm that extended in front of your eye (but a little below it). When you looked down, it appeared like a computer monitor was hovering in front of you. The very early models were 320x200 resolution in monochrome (red on black), but I tried one at a trade show in 2000 that I think was 800x600 in VGA color, which at the time was pretty decent. I'm surprised these never got more popular; they would be great for laptop computers: you could have total privacy in your viewing (unlike a normal screen), and with improvements in the technology you could potentially "see" a much larger screen than a normal laptop has.
Does anyone else remember these?

Resolution was very low, battery life very short, UI very annoying, appearance very embarrassing.
I actually enjoyed my Google Glass, primarily its always-on easy-to-reach nature. Pulling your phone out of your pocket, turning it on, and unlocking it is a trivial action, sure. But putting your finger up to your face is still an order of magnitude faster and easier. As a result, I found myself taking way more photos and having literally dozens more phone calls with friends and family. It was pretty interesting.
What are the safety implications of this? If something goes wrong, will people be blinded? What effect do these lasers have long term on the retina, the lens, and the vitreous humor of the eye?
I did not see any part of the article address these issues.
Given that the lasers are just light shining in a straight line, and that they're presumably calibrated to intensities the eye can deal with (otherwise we couldn't see them, or they'd be less sharp), there won't be any more long-term effects than from normal eye use.
Our eyes are designed to let light in. The power level of lasers that can run all day on a 350 mAh battery they share with some electronics is going to be minuscule, maybe a fraction of a milliwatt. A common laser diode available right now that outputs 1 mW continuously consumes about 36 mA to do it. Three lasers outputting 1 mW each would consume roughly 36 × 3, or 108 mA, so a 350 mAh battery would only power those three lasers for about 3.2 hours, with nothing left for the electronics to use.
Less than 1mW is not much, even for a laser, and at visible wavelengths it's not going to transfer enough heat to even measurably change the temperature of the human eye, much less damage it.
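Spelling out the parent's arithmetic (the 36 mA-per-mW figure is from the comment above, a ballpark for one particular diode, not a general datasheet value):

    # Ballpark battery math for three ~1 mW laser diodes.
    current_per_laser_mA = 36   # assumed draw for ~1 mW optical output
    n_lasers = 3
    battery_mAh = 350

    total_draw_mA = current_per_laser_mA * n_lasers  # 108 mA
    hours = battery_mAh / total_draw_mA              # ~3.2 h
    print(f"{total_draw_mA} mA draw -> {hours:.1f} h, lasers alone")

So for the glasses to run all day while also powering the electronics, the optical output has to sit far below 1 mW.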
But don't lasers need to scan in order to draw a full image (vs just a dot)? Doesn't that necessitate making the actual laser beam brighter than other light entering your eye? Are there any negative consequences to having a bright light scan over your retina quickly vs a dimmer light shining on your entire retina at once? What happens if the scanning mechanism fails and the laser ends up shining on one tiny point in the center of your eye for an extended period of time?
Lots of interesting safety questions unique to this particular display technology.
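One way to put rough numbers on the "scanner halts" case: in a raster-scanned display, each retinal spot is lit for only a tiny duty cycle, so a parked beam dumps its full continuous power onto one spot. The safe design is therefore to keep the beam's CW power below the eye-safe limit outright, rather than relying on the scanning. A sketch with illustrative assumptions (the resolution, beam power, and the ~0.39 mW figure commonly cited for the visible-light Class 1 CW limit are not Bosch's specs):

    # Duty-cycle math for a raster-scanned laser display (illustrative numbers).
    pixels = 640 * 480           # assumed frame resolution
    beam_power_mW = 0.15         # assumed instantaneous beam power

    avg_power_per_spot_mW = beam_power_mW / pixels   # normal operation, per spot
    parked_power_mW = beam_power_mW                  # scanner stuck: full CW on one spot

    CLASS_1_CW_LIMIT_mW = 0.39   # approx. visible-light Class 1 CW limit
    print(f"per-spot average while scanning: {avg_power_per_spot_mW:.2e} mW")
    print(f"concentration factor if parked: {pixels}x")
    print("eye-safe even when parked?", parked_power_mW < CLASS_1_CW_LIMIT_mW)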
Depends on how the circuit fails. I suspect the pump will fail to lase if it receives too much current; physically capping the laser power sounds like a good design decision for something going right into your eyeballs (unlike CD lasers).
There's going to be a hard limit to the amount of power you can direct through this laser. The actual hardware is going to be responsible for safety in this dimension, not software.
Just look away. It's just a display device, so at most it's going to display confusing pictures.
It's not possible to "hack" more output power into lasers with software changes. Would that it were. You can change the duration of the beam, but you can't pulse the beam without a Q-switch in a way that changes the instantaneous power.
Then you get some marketing logo etched into your retina :)
I'm wondering how the laser is scanned over the retina (probably some MEMS chip). What happens if the beam scanning suddenly stops in place? Wouldn't you get the whole laser power concentrated in a small dot? And it would happen so quickly that you wouldn't have time to react.
There may also be some phantom-image effects, like the burn-in we used to get on those old TV monitors, or the afterimages you get from fixating on a picture.
What's probably more insidious would be projecting some very lightly superimposed structure. You could probably induce some unconscious cognitive load or nausea. By decreasing the discomfort when the user is looking at an ad, for example, you could increase the ad's effectiveness.
I guess maxing out its power would be comparable to looking at a full-white LCD screen. If the device has an ambient light sensor to adjust its brightness, then circumventing it and setting it to full power could be comparable to switching on a bright light in a previously completely dark room.
A laser is just a means of making light. Stare at the sun and you will go blind, but the majority of people spend their entire lives looking at light for all of their waking hours and die at an old age with their eyesight intact. But to answer your question: yes, absolutely. The power will have to be low enough to prevent damage.
This could become big in logistics, for example in order picking.
But also for maintenance crews. Want to know which machine broke down? The glasses will give you directions and will even give you an overview of the maintenance history.
I believe this is not a consumer product. Bosch has some consumer products but they are way bigger in the business market.
And about the laser: it's just light. A laser doesn't mean 'cut through everything'. It all depends on the power. I'm sure Bosch doesn't want to melt your retina.
> And about the laser: it's just light. A laser doesn't mean 'cut through everything'
Yeah, in this context laser means a light source with extremely tight cone, meaning it can render a very tiny point on the retina. Not a ray of death that will penetrate your brain and come out the other side of the skull ;)
Please. I'll take 3.
Yeah, consumer "AR" as popularly imagined (video overlay) isn't a thing. But even at AR conferences a few years ago there were favorable case studies of AR use by shipyard workers, factory maintenance staff etc.
Maybe this will be the exception, but in general these seem so consumer focused as to be useless. I just want something that can accept some standardized or ad hoc well-documented protocol to do basic raster images or text or something as a baseline. I want something that application developers (and people like myself) can start to hack on and explore where it can go.
I know, I want to put these in ski goggles and wire up Google's maps for ski resorts, the ski resort's lift line times data, and apres ski events info, and the current and upcoming weather conditions.
Yes. So much this. I really just want a super low level API to paint monochrome vectors onto my retina. That's it. I don't want any cruft. I really really hope someone takes the *nix approach to a glasses-hud. I want the hardware to track where my retina is, paint it with a stream of vectors over bluetooth, and have strong hardware guarantees for my ocular safety. Maybe toss one of those nice Bosch IMUs in it as well.
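For illustration, here's the flavor of dead-simple wire format I have in mind; everything in it (opcodes, field widths, frame layout) is made up for the sketch, not any real protocol:

    import struct

    # Hypothetical "paint vectors" wire format: a frame is a sequence of
    # MOVE/LINE ops in a fixed 16-bit coordinate space with one byte of
    # intensity each. Timing and all safety limits live in the hardware.
    OP_MOVE, OP_LINE, OP_END = 0x01, 0x02, 0xFF

    def encode_frame(ops):
        # ops: list of ("move" | "line", x, y, intensity), 0 <= x, y < 65536.
        out = bytearray()
        for kind, x, y, level in ops:
            out += struct.pack("<BHHB", OP_MOVE if kind == "move" else OP_LINE,
                               x, y, level)
        out += struct.pack("<B", OP_END)
        return bytes(out)

    # A small box in the corner of the field of view, to be shipped to the
    # glasses over Bluetooth (e.g. writes to some hypothetical GATT characteristic).
    frame = encode_frame([
        ("move", 1000, 1000, 0),
        ("line", 5000, 1000, 200),
        ("line", 5000, 5000, 200),
        ("line", 1000, 5000, 200),
        ("line", 1000, 1000, 200),
    ])

The point is that the host only ever sends geometry; brightness and current limits stay in hardware, per the ocular-safety guarantee.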
I think AR is more suited to commercial users than consumers anyway. Google Glass and Microsoft's AR solution seem to be playing out that way.
Anecdotally, the only context I would want information beamed right into my line of sight is in work scenarios. All other scenarios I want technology to be in the background as much as possible.
I don't see the AR applications even being the most interesting part of this. A private facial recognition database coupled with "this person's name and a note to self" would be immensely helpful in a lot of situations -- I have bad facial recognition (not full-on face blindness, but inconvenient) and it would be neat to be able to hack on this a little bit. This would also require a camera of some sort, but I'd rather have that not be integrated.
Possibly even "real-world closed captions".
A tap for clock/calendar function would be handy.
Morse code (or other silent, maybe subvocal?) "telepathy" would be interesting as well.
I don't know how convenient or awful these would be in reality, but if the cost were not exorbitant and you weren't locked into a proprietary app ecosphere then it definitely seems like it would be worth a shot.
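The Morse-code idea above, at least, is trivial to prototype even before the glasses exist: decode dot/dash from tap durations and letters from the gaps. A toy sketch (the 0.2 s dot/dash boundary and 0.6 s letter gap are arbitrary assumptions, and the table is truncated):

    MORSE = {".-": "A", "-...": "B", "-.-.": "C", "-..": "D", ".": "E",
             "..-.": "F", "--.": "G", "....": "H", "..": "I", ".---": "J"}
    # ... rest of the alphabet omitted for brevity

    def decode_taps(taps, dot_max=0.2, letter_gap=0.6):
        # taps: list of (press_time, release_time) pairs in seconds.
        letters, symbol = [], ""
        for i, (down, up) in enumerate(taps):
            symbol += "." if (up - down) <= dot_max else "-"
            gap = taps[i + 1][0] - up if i + 1 < len(taps) else letter_gap
            if gap >= letter_gap:
                letters.append(MORSE.get(symbol, "?"))
                symbol = ""
        return "".join(letters)

    print(decode_taps([(0.0, 0.1), (0.3, 0.7)]))  # ".-" -> "A"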
I think the current uses are mostly commercial because that's where you can get enough money to make a bespoke application, and people aren't as sensitive about how things look. For a consumer AR application you need a lot more value to overcome how bad these things currently look, and building that data out for the world is expensive.
Photons do not get a special "laser" tag added to them by physics. If it's all in the visual wavelengths and at intensities below what we experience every day (sunlight is really bright, our eyes hide from us the number of orders of magnitude difference between even normal night-time artificial light and sunlight), there's no issue.
(However... since I often see this sorta misinterpreted in the wild on the internet, note that this is an if-then statement. If the antecedent is false, I make no claim.)
The primary safety concern I have would be met by designing the lasers such that if they are overdriven for any reason, they will physically burn out before outputting enough light to be dangerous. Per the classic Therac-25 [1] case study though, that is one safety feature I absolutely want in hardware. There is no amount of software I would accept to implement that.
I would also stick some fuses into the system, tuned below the threshold where the power would burn out the laser, along with, of course, building the whole battery system so it can't deliver enough power to drive the lasers to a dangerous level. However, I really want excess power to physically burn out the lasers. (I wouldn't want to find out the hard way that an EMP of some sort can overdrive them.)
For all that I'm laying out safety systems here, I am quite confident that it could be done safely. We trust our lives to much more dangerous systems all the time. I will say that I can't explain to you how you'd audit that safety, though.
[1]: https://www.bowdoin.edu/~allen/courses/cs260/readings/therac...
> Photons do not get a special "laser" tag added to them by physics.
True, but we usually get a pretty wide spread of light energies. These are likely going to be very specific frequencies, hitting similar areas over and over. I wonder if the retina can get fatigued by specific frequencies.
That part is not likely. If you concentrate the light over just a few retina cells, dangerous levels are very low.
>Photons do not get a special "laser" tag added to them by physics.
I thought they kinda did, by being of very similar wavelength and power, compared to normal light, which has all sorts of wavelengths and powers. This difference could mean certain rods/cones being stressed more than average without triggering the sort of fatigue that normal light would cause.
Perhaps the best example of a similar concept, though not with lasers, is looking at a total solar eclipse right before or after the sun is fully eclipsed. There is a small period of time where extremely bright light makes it into our eyes, but not the frequencies that cause the pain normally associated with looking at the sun. This means that our default defenses against looking at the sun don't kick in, and doing permanent eye damage is extremely easy without feeling any pain as the damage is done.
If the lasers are low-powered enough it should be just the same as looking at a screen, shouldn't it? Either way the same number of photons is hitting your retina; with a laser they are just aimed more carefully.
I'm mostly in agreement with you, but the part that gives me concern is precisely how precisely the lasers will be aimed at the retina. I doubt anyone ever really looks at a screen in the exact same place (even when staring at a specific character) due to eye jitter. After a year or twenty of use, are we going to start seeing burn-in on our retinas, similar to plasma screens used as kiosks?
With lasers, the photons are collimated, so they're all going almost exactly the same direction, instead of just being scattered like with normal light sources. Because of this, it doesn't take that much laser power in your eye for your lens to focus it down to a small patch on your retina and burn holes in it. So you'll need very low-powered lasers for this to be safe. But it certainly should be possible.
Lasers are effectively point light sources.
We usually think of them as perfectly collimated light, but that's just a special case, where the point is at infinity.
The consequence of being a point light source is that lenses can refocus the beam, parallel or not, back into a point. Doing so concentrates a lot of energy on a tiny surface, and if that surface is your retina, that's when it becomes dangerous: literally burning a tiny hole into it.
That can never happen with, say, a regular light bulb, or even the sun. If the light source is spread out, its image on your retina, or anywhere else, will be spread out, limiting the energy density. This is an indirect consequence of the second law of thermodynamics called the conservation of etendue.
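To put rough numbers on that: even a weak collimated beam, once the eye focuses it down to a small spot, produces an enormous irradiance compared with everyday sources. A back-of-envelope sketch (the 1 mW power and 20 µm spot size are illustrative assumptions):

    import math

    # A collimated 1 mW beam focused by the eye to a ~20 micron retinal spot.
    power_W = 1e-3
    spot_diameter_m = 20e-6
    spot_area_m2 = math.pi * (spot_diameter_m / 2) ** 2

    irradiance = power_W / spot_area_m2  # ~3e6 W/m^2 on that one spot
    sunlight = 1000.0                    # W/m^2, typical direct sunlight at ground
    print(f"{irradiance:.1e} W/m^2, ~{irradiance / sunlight:.0f}x direct sunlight")

Which is why a retinal-projection display has to run at powers far below 1 mW, as noted above.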
Till you can, of course (well, maybe not us, but I see it quite realistically for our children, with perks like taking pictures, seeing better in the dark, maybe infrared, zoom).