Because I practice them a lot, and the dynamics of a real crosswind landing aren't reproducible in a home sim. You can practice where to put the controls, but in a trainer like a 172 or DA20 you can't practice the forces you feel pushing back at you and the other seat-of-the-pants aspects.
One of the drills I do is to descend to 50 ft over the runway and then hold the plane on the centerline in, say, a 20 kt crosswind. The sim can't come close to how dynamic and effective that exercise is. Same for the feeling of putting down one wheel first, feeling the bump, then the other; the slight skid and side g-forces if you mess it up slightly. Those kinds of things.
There are many things sims can't reproduce that don't matter much for productive learning. But with crosswind landings, these things matter a lot.
I'm specifically referring to home simulators, MSFS and X-Plane in particular. I have no experience with full-motion commercial jet or military sims.
A few other posts mentioned that even the most sophisticated sim machines cannot simulate acceleration or deceleration. Do you think "the forces you feel pushing back at you" here could mean deceleration?
It was an attempt to find the simplest possible mathematical system that was universal, i.e., could compute any computable function. It turns out that function application by itself is universal (if functions are first-class entities). This was a surprise at the time, and it still generally surprises people today when they first learn of it.
There was a concerted effort in the late 19th and early 20th century (perhaps earlier too) to mechanise computation, i.e., to reduce it to pure symbolic manipulation. There were obvious benefits; a famous example is the Bombe, the machine used at Bletchley Park to (successfully) break the German Enigma cipher during WW2.
On the philosophical side, a parallel and overlapping effort was going on to figure out whether mathematics could represent all possible truths, again as a symbolic manipulation system. Bertrand Russell and Alfred North Whitehead's magnum opus Principia Mathematica [1] was one famous work in this direction. Kurt Gödel then made a breakthrough when he proved that such a system is impossible: a sufficiently powerful system can either capture all the truths or be consistent, but not both[2]. Put differently, any mathematical system capable of representing all the truths will necessarily contain contradictions within it.
Now coming to your question.
Lambda calculus emerged in this milieu. Alonzo Church[3] invented one such system to mechanise computation, the λ-calculus. Using this system one can mechanically compute any computable function purely by symbolic manipulation. Later on Turing, who was Church's doctoral student, invented a totally different system with the same purpose, the Turing Machine. The two formalisms were then proved equivalent (Turing showed this in 1937); the Church-Turing thesis[4] is the further claim that these systems capture everything that is effectively computable.
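To make the "function application by itself is universal" point concrete, here's a minimal sketch of Church encodings in Python (Python lambdas stand in for λ-calculus terms; the names `zero`, `succ`, `add`, `mul`, and `to_int` are just illustrative, not standard API):

```python
# Church numerals: the number n is encoded as the function that
# applies f to x exactly n times. Everything below is built from
# function definition and application alone.
zero = lambda f: lambda x: x
succ = lambda n: lambda f: lambda x: f(n(f)(x))

# Arithmetic, again as pure application.
add = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))
mul = lambda m: lambda n: lambda f: m(n(f))

# Decode a Church numeral to a Python int for inspection.
to_int = lambda n: n(lambda k: k + 1)(0)

one = succ(zero)
two = succ(one)
three = add(one)(two)

print(to_int(three))            # 3
print(to_int(mul(two)(three)))  # 6
```

Booleans, pairs, lists, and recursion (via fixed-point combinators) can be encoded the same way, which is what the universality claim amounts to.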
All this work, and more, laid the theoretical foundation for modern computers. If today we can reason rigorously about what computers can and cannot do, it's because of them.
Phil Wadler has an absolutely delightful talk where he takes us through a whirlwind tour of the history of the mathematical foundation of computers [5], highly recommended.
To give people an idea of what "regional trains" means:
On regional trains you can travel Germany from north to south in less than 14 hours with 5 changes, as opposed to 8 hours with 2 changes for 125 EUR one-way on long-distance trains.
With the 49 EUR/month ticket you can hop off the train anytime you are tired of the journey. You can't do that with the 125 EUR one-way long-distance ticket.
I can totally see young and elderly people doing this on regional trains.
Remote: Yes
Willing to relocate: Yes (Europe/UK; open to exceptional opportunities elsewhere)
Technologies: Python, PyTorch, NumPy, transformers, multimodal ML, LLMs/VLMs, adversarial robustness, AI security, red-teaming, research
Résumé/CV: https://chs20.github.io
Email: schlarmann.christian@gmail.com
PhD researcher in ML at the University of Tübingen seeking Research Scientist / Research Engineer roles. I work on robust and secure machine learning, with a focus on multimodal models, adversarial attacks and defenses, and AI safety. Publications include ICML and NeurIPS; I also have hands-on experience red-teaming frontier AI systems. Open to AI security, multimodal/generative ML, and strong research-oriented ML roles more broadly.