You can learn quite a lot about aviation UI from civilian airliners' instruments. I worked a little with the Boeing 777X's.
It is not great, actually. PFDs (primary flight displays) are cluttered and information-noisy. The HUD is a much better tool for flying. Fortunately for civilian pilots, HUDs are becoming more common now.
The FMS (flight management system) has all the usability of an IBM mainframe from the 1960s, and about the same performance.
The tasks a pilot has to do before the flight are rather simple: input weather conditions, aircraft load, and waypoints for the autopilot. But even with the most modern FMS it is a tedious and frustrating process: non-intuitive control flow, a non-QWERTY keyboard, and an entirely text-based, non-graphical interface.
The better parts of aircraft UI are EICAS/ECAM (engine indication and crew alerting) - they are both useful and intuitive to understand, with an emphasis on graphical indication.
A lot of the hard-to-use bits come not from technology limitations - modern aircraft displays are quite capable - but from decades of industry legacy and the expensive certification required for any change.
Ask any pilot what the most complex part of switching aircraft and getting a new type rating is. It's never the way the plane handles; it's always either memorizing the mandatory systems knowledge or learning how to use the FMS.
I flew along on a small private jet recently; its system was 10x easier to understand and more capable than the typical airliner's... So it is possible, probably even at a lower cost.
This seems to be an almost universal problem at old, large engineering companies. Somewhere along the way (probably as they slowly ossify into bureaucracies) all passion for innovation slowly withers.
I don't know what the answer is but attitudes generally flow down from the top, so I'd imagine a good start would be to somehow re-install creative technical people in the top-end leadership roles.
https://www.youtube.com/watch?v=BIAO0mnQU7g
I recently watched a few videos of flights in the Cirrus Vision Jet, literally the smallest and lowest-cost single-engine private jet on the market. Its electronic cockpit looks very straightforward and logical in design.
At least as far as I've gathered from reading airliner crash reports (which are obviously a skewed sample), one of the challenges seems to be staying current / adjusting from one airliner to the next in terms of procedures and recovery processes for a given aircraft.
Being behind the aircraft - slow to interpret the situation and indicators, and not knowing what recovery steps to take - seems to be a recurring problem, at least in catastrophic situations.
You say information-noisy, but some might WANT it information-noisy so that they don't have to press buttons to find the information they need.
I'm not saying they do (I'm a software developer, Jim, not a pilot), just something to keep in mind. I'm rebuilding a webapp myself with a more modern design, and in an early demo I've already gotten a remark about information density.
I would consider this a positive. Graphics do not imply improved readability; the information is dense and you have no choice but to present it as concisely as possible. You also need a dim cockpit for low-light operations.
What could be a win is integration with the QRG to make information easier to access during an abnormal operating condition. You spend precious seconds locating and flipping through paper books to identify mitigation procedures.
It's infuriating trying to use the touch screen in my wife's minivan to change the radio station. I can't imagine the frustration trying to use a touch screen to fly a fighter jet.
More buttons please!
The G3000 integrated flight deck is pretty much the default in new business jets too.
In my opinion the touch screens are a lot easier to use than the old G1000 screens except for when you are getting tossed around in turbulence. Fortunately, there are still physical buttons for all of the "safety critical" or "time critical" inputs such as configuring the autopilot, changing radio frequencies, or adjusting the barometric pressure for the altimeter.
Source: I am a private pilot and have used the G1000, G300, and G3X touch.
I've always wanted to build a big rig of buttons to surround my pc monitor that I could program for various functions I do every day.
When working on BioShock I had a strip of buttons that could be programmed to send keystrokes as if it were a keyboard. I was working in Unreal Engine on the Xbox 360. I could plug it into the dev kit, and with one click enter all kinds of obscure console commands I could never remember. It was great.
Update: was one of these. https://xkeys.com/
My company builds and works with industrial automation, custom CNC machines, and industrial robots. I've observed a shift from purely button-and-neon panels on early PLC or relay-logic machines, to multifunction keys around the human-machine interface display (HMI) before touchscreens, to a fitful few years when people thought it was a good idea to build machines with VB6 and various serial-to-digital IO adapters, to purely touchscreen-driven machines with one legally mandated physical emergency stop button (about when I arrived in the industry; the other machines were mostly historic beasts I've occasionally been charged with maintaining), and now back towards multifunction keys. There are usually also a few dedicated buttons and indicators for common operations (reset, cycle start, control power on, feed hold, feed rate override, etc.), which still makes sense most of the time.
I really think multifunction keys are the best of both worlds. As the author of this piece describes, the multifunction display with 20 keys around the outside (or, for CNCs, 10 keys across the bottom of the monitor and 10 off to the right, in a 1-4-4-1 spacing so you can feel exactly which button you're hovering over while you're staring unblinking at a cutter chewing through 5-figure assemblies) is a good compromise. It takes some serious concentrated planning to design a set of keys that are intuitive (top to scroll up, bottom to scroll down, one dedicated for enter, two for context-specific operations, etc), but it gives you the freedom to design relatively shallow but featureful menu systems that you can memorize and get tactile feedback to operate with confidence.
I have a StreamDeck [1] that I use for various random things. There's an open source driver for it that you can drive from Python [2], letting you run Python code when a button is pressed. You can do some cool stuff with it (e.g. I've used mine to make HTTP requests on button press).
[1] https://www.elgato.com/en/gaming/stream-deck
[2] https://github.com/abcminiuser/python-elgato-streamdeck
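The driver's model is simple even with no hardware attached: it invokes a registered callback with the key index and press state, and you dispatch from there. A minimal sketch of that dispatch layer (the key bindings here are my own illustration, not part of the library, so this runs without the hardware):

```python
# Sketch of a key -> action dispatch table, the kind of function you'd
# register with a StreamDeck-style driver's set_key_callback().
events = []

ACTIONS = {
    0: lambda: events.append("mic muted"),
    1: lambda: events.append("HTTP request fired"),  # e.g. a requests.get(...) call
}

def on_key_change(deck, key, pressed):
    # Callback shape used by python-elgato-streamdeck: it fires on both
    # press and release, so act only on the press edge.
    if pressed and key in ACTIONS:
        ACTIONS[key]()

on_key_change(None, 1, True)   # simulate pressing key 1
on_key_change(None, 1, False)  # release: no action
print(events)
```

Binding a key to a real HTTP request would just mean swapping the lambda for an actual request call.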
I've got a StreamDeck XL mounted under my monitor, in a button box; its capabilities as a macro machine are vastly underrated, although a few people are catching on that it's not just for live content creation.
It's absolutely amazing when doing remote work to be able to fire up OBS Studio, join the meeting, control the microphone, etc. in a single click - and I have all of that on just one page, along with other meeting controls such as camera scenes. Another page is for Visual Studio; there's an integration with it that's quite useful for sending commands directly. I have all my most often used work websites on buttons of their own.
As mentioned, I also have button boxes, though I'm still working out the best way of sending commands; Joy2Key seems to mess up Alt keys sometimes, making them sticky. When it worked it was nice - imagine adjusting text size just by twisting a rotary, for example.
This is why the MacBook Touch Bar is so awesome. I know it got a lot of hate when it first launched, but I use it all the time and have all sorts of customizations.
http://www.hpmuseum.net/display_item.php?hw=116
I think it's called a keyboard ;)
> I can’t imagine the frustration trying to use a touch screen to fly a fighter jet.
A war plane is something you have to operate while it's burning, or while you're bleeding on it, or while you can't see properly because it's full of smoke or someone just blinded you with a laser. The adoption of touch screens in this sort of cockpit seems misguided. Particularly for anything related to controlling comms or navigation.
That's really not how we design modern jet fighters. Air combat involves a huge number of tradeoffs, and ejecting is now the correct response to a wide range of issues. For example, the F-35 has a single engine, so engine failure is likely to result in a lost aircraft.
It's basically been decided that we are going to spend silly money keeping a small number of absolutely cutting-edge aircraft flying, rather than thousands if not tens of thousands of likely more efficient but less capable aircraft, possibly drones.
PS: To be clear, it's possible they're making the correct choice. I personally doubt it, but I don't have access to the kind of classified documents that would justify things one way or the other. An effective laser weapon, for example, might render vastly cheaper drone fleets ineffective.
There are lots of buttons and switches on the throttle and stick. Most modern western fighters are designed for HOTAS (hands on throttle and stick) operation, where the pilot never takes their hands off the throttle and stick during combat.
> Something that has been lost in all glass cockpits is the tactile feel of pressing buttons and knowing you got a response – I found you could enter Lat/Longs by feel whilst looking out the window. This is something you definitely can’t do ‘on the glass’ on current jets.
I also really miss this from the dumbphone era. I used to be able to type SMS texts by feel using a T9 keypad while walking down the street, and only occasionally have to check the screen to see what I wrote. Smartphone QWERTY makes this impossible.
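For anyone who never used T9: each digit carries three or four letters, and the phone matches the whole digit sequence against a dictionary, which is why one press per letter was enough to type by feel. A toy sketch of the lookup (the keypad mapping is the standard one; the word list is a tiny made-up sample):

```python
# Toy T9: map a digit sequence back to candidate dictionary words.
KEYPAD = {
    "2": "abc", "3": "def", "4": "ghi", "5": "jkl",
    "6": "mno", "7": "pqrs", "8": "tuv", "9": "wxyz",
}
LETTER_TO_DIGIT = {ch: d for d, letters in KEYPAD.items() for ch in letters}

def to_digits(word):
    # The digit sequence you'd press for a word, one press per letter.
    return "".join(LETTER_TO_DIGIT[ch] for ch in word.lower())

def t9_candidates(digits, dictionary):
    # Every dictionary word whose key sequence matches what was typed.
    return [w for w in dictionary if to_digits(w) == digits]

words = ["home", "good", "gone", "hood", "hello"]
print(t9_candidates("4663", words))  # four of the five words collide on 4663
```

The collisions are why real phones had a "next candidate" key, and why a decent on-device dictionary mattered so much.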
Same for car interfaces. My 17-year-old BMW has zero big touchscreens; for everything there is a manual knob/button. I've found I don't need more. I can tweak most of the stuff I need, like the AC and radio, while driving, without losing contact with the road even for a split second.
I know from experience that when I take my eyes off the road, there could be an atomic blast in front of me and I would hardly notice it. Somebody slamming on the brakes in front of me might be noticed peripherally, but without context it could cause a stupid and dangerous reaction.
When Tesla started putting what are basically tablets in place of knobs, some folks cheered. I couldn't be more sad about this trend. It's clearly the worse direction for those like me.
Or even worse: little touchscreens that resemble a physical knob/button, even with haptic feedback!
All the physical buttons/switches in my 20-year-old Mitsubishi still work. I have a feeling the newfangled, fancy LCD switches won't last that long, and will be very expensive to fix when they break.
> At present I am pressing the wrong part of the screen about 20% of the time in flight due to either mis-identification, or more commonly by my finger getting jostled around in turbulence or under G.
I'm surprised hand rests aren't used more in these applications, similar to the finger "shelves" the Dragon capsule has for its touch panels. It would make accurate touch input at least slightly easier.
Hand rests are only half the problem - even with a totally steady finger, it's tough to consistently find a small area on a touchscreen (and that's even ignoring the level of customization the F-35 allows) without looking at it. The author wasn't joking when he talked about inputting coordinates without looking at the control panel, and I expect it will take some massive advancements in haptics before we can do anything similar with touchscreens.
I can absolutely type on my iPhone without looking at it. I just typed this comment with zero visual contact with my phone and no edits.
(EDIT: this is a weirdly divisive comment.)
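That difficulty has a standard back-of-envelope model in HCI: Fitts's law, where the index of difficulty grows as the target shrinks relative to the movement distance. A rough illustration (the reach and target sizes are my own guesses, not F-35 figures):

```python
import math

def index_of_difficulty(distance_mm, width_mm):
    # Shannon formulation of Fitts's law: ID = log2(D/W + 1), in bits.
    # Higher ID means a slower, more error-prone pointing task.
    return math.log2(distance_mm / width_mm + 1)

REACH = 400.0  # assumed hand-travel distance to the panel, in mm
for width in (60.0, 15.0):  # a big soft key vs. a small touch cell, in mm
    print(f"{width:>4.0f} mm target: {index_of_difficulty(REACH, width):.2f} bits")
```

Turbulence and G effectively widen your finger's scatter, which is like shrinking the target further; a physical switch found by feel sidesteps the visual acquisition step entirely.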
Yeah, this seems like a fairly easy workaround. A possible limitation is that the cramped cockpit may literally not have enough room; your knees might hit the rests when working the rudder pedals.
Maybe if it were a flip-down thing where you could rest your hand when needed, but also bounce it out of the way when it's not needed.
> think how much easier it is to type on a smartphone with your thumbs versus trying to stab at a virtual keyboard on a large tablet with just your index finger.
My experience typing on a touch screen on bumpy roads, or even on a seemingly smooth highway at normal speed (as a passenger, of course), is that I miss a lot of keys, even if I grip my phone with two hands and type with my thumbs, which are a little too wide. Luckily I learned to Swype, and that works much better because of the constant contact with the screen - fingers can't jump away. Still, getting shifted characters right is a problem. A physical keyboard is much better in that environment.
This is way more sparse and streamlined than I thought. I wonder how big the manual for a flight simulator game would be. Here's some video of the UI in action in a simulator:
https://www.youtube.com/watch?v=JSy8DcLaRDo
Is there any design documentation for UX/UI in military hardware? Is there a name for this style? Is it actually as functional as it looks? I wonder how these designers feel about dealing with military PowerPoints.
As an intern 25+ years ago I had an assignment to draw up mocks for a new UH-1N interface based on pilot feedback. There were a lot of rules (I think all under the umbrella of "MIL-SPEC"), but I think the one that sunk in the most was that the design had to work in monochrome. Depending on shades and colors breaks the design for the color-blind but also ruins interactions in bad lighting conditions, wearing helmets, etc.
It was a bigger, bolder look than was common on desktop computers, and I've always personally loved the style. There was a brief rise of similar rules in the civilian world as things moved mobile and we had to work on small screens with big fingers, but the direction now is to try to remove anything that might hint of interaction from the screen entirely. (Not a fan of this myself!)
I've been in civilian work pretty much ever since so I have no idea how things have changed since then. Edit: on second glance I think a lot of the same principles are still in effect. While colorful I'm not sure that anything depends on reading the color, and seen from the pilot's chair it probably isn't all that crowded.
Tangentially, your point about meaningful shades/colors situationally breaking the design even for non-color-blind people is a great example of the need to always keep accessibility in mind.
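That "must work in monochrome" rule has a cheap automated analogue today: reduce each state color to luminance and check that the pair still reads as two different things. A sketch using the WCAG 2.x relative-luminance and contrast-ratio formulas (the 3:1 threshold is my arbitrary pick for "distinguishable"):

```python
def relative_luminance(rgb):
    # WCAG 2.x relative luminance from 8-bit sRGB components.
    def lin(c):
        c = c / 255.0
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (lin(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def survives_monochrome(rgb_a, rgb_b, min_contrast=3.0):
    # WCAG contrast ratio (L1 + 0.05) / (L2 + 0.05). If two state colors
    # collapse to similar luminance, they become one color in monochrome,
    # for color-blind users, or in bad lighting.
    brighter, darker = sorted(
        (relative_luminance(rgb_a), relative_luminance(rgb_b)), reverse=True)
    return (brighter + 0.05) / (darker + 0.05) >= min_contrast

print(survives_monochrome((255, 0, 0), (0, 255, 0)))      # pure red vs. pure green
print(survives_monochrome((255, 255, 255), (128, 0, 0)))  # white vs. dark red
```

Notably, pure red vs. pure green, the classic status-light pairing, fails this check.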
There are many design guidelines and studies for A&D (aerospace and defense); everything from which typeface to use to how to display quantities and directions. I am not aware of any comprehensive style guide or all-encompassing name for it.
> Wonder how big the manual for flight simulator game would be.
One manual for the F-14B is about 1k pages, so I would assume this would clock in at significantly more than that, given the added capabilities (in information at least, possibly not in pages, since the F-14 was pre-HOTAS and the like):
http://server.3rd-wing.net/public/Ked/natops%20F14B.pdf
Wow, how did the fact that G-forces make it difficult to use voice commands or to interact with a touch screen not kill these F-35 cockpit features a long time ago?
To be really specific, for those not aware: pilots need to perform special breathing routines when under high G-forces.
Deep, rapid breaths. They need to get oxygen into their blood as quickly and efficiently as possible to maintain consciousness, because the G-forces push blood away from their heads and towards their legs and feet.
Here's a "classic" video of a pilot successfully evading an unbelievable number of SAMs over Iraq. The rapid breathing might make you think he's panicking. Nope, he's got it 100% under control and is following his training perfectly. Though I'm sure he needed a drink after making it home:
https://www.youtube.com/watch?v=qUjX1RntqVw
There's also a sort of thigh-flexing method they learn. It restricts blood flow to the legs, so there's more blood for the rest of the body. That's a bit of an athletic endeavor in and of itself. Try flexing your thighs... now hold them that way for 5, 10, 20 minutes. Yikes.
Anyway, how the hell could you even bark out voice commands while doing that sort of breathing!?!?
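Rough numbers behind the blood-pooling: the pressure needed to push blood from heart to head scales with density x acceleration x column height, so the hydrostatic penalty multiplies with G-load. A quick estimate (the blood density and heart-to-head distance are textbook approximations):

```python
RHO_BLOOD = 1060.0    # kg/m^3, approximate density of whole blood
G0 = 9.81             # m/s^2, one gravity
HEART_TO_HEAD = 0.35  # m, rough vertical heart-to-head distance

def head_pressure_drop_mmhg(g_load):
    # Hydrostatic drop over the heart-to-head column: rho * g * h,
    # converted from pascals to mmHg (1 mmHg = 133.322 Pa).
    return RHO_BLOOD * G0 * g_load * HEART_TO_HEAD / 133.322

print(f"1 g: {head_pressure_drop_mmhg(1):.0f} mmHg")
print(f"9 g: {head_pressure_drop_mmhg(9):.0f} mmHg")
```

At 9 g the drop comes out around 245 mmHg, well above normal systolic pressure, which is why blood stops reaching the brain without the straining and breathing maneuvers.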
These voice commands are usually just for changing radio settings and other non-essential functionality - stuff that would be just as difficult to change with tactile switches under high G. The Eurofighter has been using a similar system since the '90s.
First, I'm curious if they use a throat mike or something bone conducting.
Second, given how shitty voice interfaces are, I'm shocked they would use them at all. I assume the military funds a lot of EEG research - that would be ideal here.
Edit: the answer is no
According to the article the author knows of no pilots that use the voice interface.
Given how finicky they are I can't say I'm surprised. Maybe with modern pseudo-AI systems they could be reliable enough to depend on, but none of that is in military hardware specced out 20 years ago.
>I'm curious if they use a throat mike or something bone conducting.
No, and no. Just good, old-fashioned microphones and speakers. If you're lucky, you have a set of "communication ear plugs" wired up to your helmet so the speakers in your earcups aren't trying to compete with engine noise and your ear plugs.
That seems... bad, but also totally unsurprising.
I only know some standard "anti-G" actions involve breathing really hard while keeping your lungs inflated. I guess that makes speaking more difficult?
I play a lot of Digital Combat Simulator, which models a lot of these planes. At first I was interested in shooting things, then in the flying, but in the end I learned I was most interested in the computer systems, and in how these games are largely virtual simulations of old computers. It's fun to see what changed over the years and between cultures: the MiGs are completely different from their Western counterparts, and how you navigate from point to point is often completely different. It's also interesting how durable the UX of MFDs has been in cockpit design. When I went for my GA license, it was funny to see how many newer models of planes had moved to "glass" cockpits. In some ways, it felt like putting WiFi on your fridge; most of the GA planes folks fly were made in the 70s.
Agreed! If the information in this article is even slightly interesting to you, then you owe it to yourself to spend some time with Digital Combat Simulator. In there you can get first-hand experience with the Harrier and the F/A 18 discussed in the article. You will spend a lot of time with the cockpits and start to develop your own affinity for things in the cockpit, and develop your own ideas about what you like and what you don't.
Do a Google Images search for "DCS FA18 cockpit" or "DCS Harrier cockpit" and you'll see that the real-life photos are pretty much indistinguishable from the screenshots if you take the word "DCS" out of the search term. Every switch and gauge is faithfully simulated, and you can use the real-life operating manuals to operate these simulators.
There's a lot of overlap with the things I love about computer systems: it's fun to learn, to operate systems, and to feel that sense of mastery as you gain confidence in making the machines do your bidding. Those feelings are much the same (and different) between operating computers and operating aircraft systems. To a large extent these modern planes feel a lot like operating a big flying computer, at least in the simulators. This is also true of things like the G1000 glass cockpit in the Cessnas in Flight Simulator 2020.
One thing that's particularly impressive about the F/A-18 is that the version available in DCS is (give or take a few features) almost the same, UI-wise, as the first batch from the 80s.
In 1983, serving alongside the almost completely analog, 60s-era F-14, the F/A-18 had three pretty modern MFDs. Definitely more advanced than most Star Wars cockpits, for example.
I'm flabbergasted that MFDs haven't made more inroads in automotive and computer-input applications. We even have decent HUDs in cars now, but no MFDs. I'm convinced a good MFD would be far superior to the typical touchscreen crap; if Tesla had an MFD and a HUD instead of the giant TV screen, I would have bought one by now.
I think it's because the MFD model is designed for trained pilots who have to take a test to use them. I'd love one, but I can just imagine the complaints about the different pages and things like that.
Also, I'd love one for home automation. There's only so much data you need to display or enter, so the reduced latency of a dumb screen wired straight into the back of an internet-connected SBC (or even a microcontroller) could be really nice to have in the kitchen, rather than fiddling around with either a phone or what would end up being a tablet bolted to a wall.
The head unit in my Ford Focus (2004-ish?) had an MFD. Parallax was a bit of an issue (which buttons went with which function looked different from the passenger and driver seats). I rate it as just "okay." The one place it was clearly better than a touchscreen was selecting the radio station from a list of "favorites": each station was always on the same button, so switching was simple.
Note that this was for controlling the audio system only; this predates integrated "infotainment" center stacks.
> Having bashed the interface, the way this jet displays information to you is incredible. The sheer amount of situational awareness I gain from this aircraft and its displays is like nothing I’ve experienced before. The off-boresight helmet is much more accurate than legacy JHMCS systems and I find it clearer to read (although I still want a wide-angle HUD for flight and fight-critical data!). About the only thing missing from the whole cockpit is the lack of ‘feel’.
The prevailing narrative on HN and elsewhere (that seems to be a US DoD disinfo campaign) is that the JSF is an overpriced, overcomplicated piece of junk. Then there are these murmurs (largely from actual operators) leaking the fact that the aircraft is actually "incredible."