Last month I sat in our training office at Simaero, watching an instructor try to pull a student's performance trend from three different systems: a paper progress report from 2020, a scanned PDF, and a digital progress report. He wanted to use an AI tool to identify weak areas before the next sim session. The AI gave him a confident, detailed analysis. It was also completely wrong, because half the data was missing and the other half was inconsistent. A very intelligent tool, working with very unintelligent information.

That moment is the entire aviation AI story right now, compressed into one frustrating afternoon.

What Is Actually Happening

Next month, Ryan and I are heading to the IATA World Data Symposium in Singapore. Over 700 aviation leaders from Airbus, Boeing, Singapore Airlines, OpenAI, IBM, SITA and more will gather to discuss data, AI, cybersecurity, and the digital infrastructure underpinning this industry.

We are going because this is where boardroom decisions get made that land on your flight deck six months later. Bryan Air's job is to translate those decisions before they arrive, to tell you what they actually mean for the line pilot, the training captain, and the first officer looking for a new gig abroad.

But here is what I want you to understand before a single keynote is delivered: the conversation at WDS will be dominated by AI. And the dirty secret behind every AI conversation in aviation right now is that most organisations are not ready. Not because the technology isn't there, but because their data isn't.

Only 37% of airlines have successfully implemented unified data platforms capable of supporting advanced analytics. That is not my opinion; that is the industry's own assessment of where it stands. The rest are running on fragmented systems, legacy software, paper trails, and manual processes that no AI tool can make sense of.

The Upside

When the data is right, the possibilities are genuinely exciting. And I have seen this first-hand.

Through e-aerospace, I built a digital decision-making simulator: a tool designed to sharpen risk management for airline pilots. The AI added value to the tool because we fed it clean, structured, digital data from real training outcomes. It worked because the foundation was solid.

That is the model. AI as an accelerant for good training design, not a replacement for it. Imagine your recurrent sim programme adapting to your individual performance profile, not because a computer is guessing, but because your training data has been captured digitally, consistently, over years, and an AI can spot patterns you and your training captain might miss. Stand by for the introduction of real CBTA: done right and enhanced with AI, the results are going to be impressive.

This is also career intelligence. The pilot who understands data-driven training, who can engage with it rather than just endure it, stands out. That matters whether you are preparing for command, interviewing at a new airline, or building resilience into a career that, as I know too well, can be disrupted overnight.

The Downside

Here is the part no one at a tech conference wants to talk about: garbage in, gospel out.

AI does not hedge. It does not say "I'm not sure about this because your records from 2022 are incomplete." It gives you a confident, articulate, well-structured answer, and if the underlying data is wrong, incomplete, or inconsistent, that answer will be wrong, confidently. In a business context, that is expensive. In an aviation operations context, it could be catastrophic.

This is why so many companies struggle to get AI to deliver real value. They are trying to leapfrog straight to the AI layer without doing the foundational work: digitising their processes, cleaning their data, automating the collection flow. You cannot jump to AI. You need to walk first: digitise, standardise, automate, collect. Then AI becomes powerful. Skip the walk, and you are handing a Formula 1 car to someone who hasn't built the road yet.

And then there is the regulation question. EASA published its first-ever regulatory proposal on AI in aviation in November 2025: NPA 2025-07, an AI Trustworthiness Framework covering Level 1 (AI-based assistance) and Level 2 (human-AI teaming). It is genuinely impressive work, and it may well become the global reference standard.

So the technology is a Formula 1 car. The regulation is a bicycle. Both are heading in the same direction, but they are not running the same race. This creates a real tension: do we slow down and wait for the regulator, or do we push forward and hope the rules catch up? For safety-critical applications, and I am thinking specifically about AI-assisted decision-making in the cockpit, the answer matters enormously. I have been experimenting with a digital risk mitigation matrix and decision-making calculator designed to work offline in the flight deck. The tool exists. The question of whether a regulator will certify an AI-assisted captain's decision in the next couple of years is an entirely different conversation, and one Ryan and I will be pressing at WDS.

The Pilot's Toolkit

Here is your Kaizen step for this week. Before you touch any AI tool, I want you to do something simpler: audit your own data.

Open Claude, ChatGPT, or Perplexity, whichever you have access to, and paste in a set of your own training notes, study summaries, or career records. Ask it to identify gaps, inconsistencies, or areas where information is missing. You will be surprised at what comes back. If your personal data is messy, imagine what your airline's data looks like at scale.

Then take one step further: start capturing your notes digitally, in a consistent format, from today. A simple notes app with a standard template: date, session type, areas reviewed, self-assessment, action items. Fifteen minutes after each session. You are building your own clean dataset. When the AI tools mature, and they will, you will be ready to use them, because your data will be ready too.
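If you want that template to be machine-readable from day one, here is a minimal sketch in Python. It appends each note as one JSON line, so every entry shares the same structure. The field names (`session_type`, `areas_reviewed`, and so on) are my own illustration of the template above, not any standard, so rename them to suit your own logbook.

```python
import json
from datetime import date

def log_session(path, session_type, areas_reviewed, self_assessment, action_items):
    """Append one training note to a JSON Lines file with a consistent structure."""
    entry = {
        "date": date.today().isoformat(),   # always ISO format, so entries sort cleanly
        "session_type": session_type,       # e.g. "recurrent sim", "line check"
        "areas_reviewed": areas_reviewed,   # list of topics covered
        "self_assessment": self_assessment, # free text or a simple score
        "action_items": action_items,       # what to work on before next session
    }
    with open(path, "a", encoding="utf-8") as f:
        f.write(json.dumps(entry) + "\n")
    return entry

# Example entry, fifteen minutes after a sim session:
log_session(
    "training_log.jsonl",
    "recurrent sim",
    ["engine-out go-around", "windshear escape"],
    "go-around flow solid; windshear recognition a beat late",
    ["review windshear callouts before next session"],
)
```

The point of the one-line-per-entry format is that years of notes stay trivially parseable: any AI tool, spreadsheet, or script can read the file without cleanup, which is exactly the "walk before you run" discipline described above.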

The Close

The aviation industry has always been exceptional at managing risk in the air. We brief threats, we run checklists, we monitor and cross-check. The question now is whether we can apply that same discipline to how we manage data on the ground because data is the raw material of every AI decision that will eventually reach your cockpit.

Ryan and I will be at IATA WDS in Singapore next month, asking the hard questions and translating the boardroom language into flight deck reality. If there is something specific you want us to dig into, reach out: drop a comment, send a message, or catch us on the Bryan Air Podcast. We are 260+ episodes deep. We are not going anywhere.

Fly safe. Think smart.
Bryan
