The Leica Experience
In the world of photography, there is one brand that stands apart from the rest: Leica. It exudes an air of quality, refinement, and heritage. These cameras signal wealth and sophistication, and serve equally as cameras and as fashion pieces. Many of the greatest photographers in history shot their work on Leica. They are very expensive, too.
That cachet is strongest with their rangefinder series of cameras, which use an anachronistic manual focusing method that relies on optomechanical clockwork instead of digital trickery. You frame your shots through a plain window with overlaid framelines, free of the digital displays and distracting iconography of modern cameras. A reduction to the essentials, pure distilled photography, according to the myth.
It was thus with much anticipation that I took delivery of a Leica M240 with a Leica 35mm ƒ/2 “Summicron” lens, graciously loaned by a friend from my local camera club. I have learned often enough to optimize for joy, not utility, so this could turn into a costly prospect if the experience were to appeal to me.
The camera is indeed a beautiful object, exquisitely machined from solid metal. In an unexpected inversion of modern camera design, it sports a chunky body, but a small lens. A pleasant thing to hold, if a bit heavier than I had expected. Operation of the camera is pure simplicity: there's a shutter speed dial with an automatic setting, an aperture ring on the lens, and a focus tab. The latter two offer no automation, unlike on modern cameras.
My Leica-using friends in the camera club predicted that I'd get used to the manual focusing quickly. And indeed I did. But precise focusing with a rangefinder is somewhat slow and only possible in the frame center. So where speed is required, I instead relied on good old zone focusing: pre-setting the expected distance on the lens, and stopping down a bit to get a deep zone of focus. Thus prepared to take a shot, there remains nothing but pressing the shutter button. This perfect simplicity of just being there, and worrying about nothing but the scene and the composition, feels unexpectedly freeing. On the flip side of course, framing with a window finder is imprecise, and getting accurate focus at large apertures is not easy or fast.
The rangefinder lenses were another surprise. They are very much unlike modern designs, prioritizing compactness and rendering over versatility. They are not sharp wide open, exhibit strong field curvature, and need to be stopped down to become sharp off-center. Annoyingly, the combination of field curvature and the centered rangefinder patch makes anything but centered compositions impossible wide open. Still, this is actually a fair tradeoff for smaller lenses, and one I wish more modern lenses would offer, especially as it also tends to make for smoother focus transitions and bokeh.
Overall, I enjoyed the rangefinder style of shooting. However, the Leica M240 also annoyed me a bit by taking too long to switch on. More than once, I raised it to my eye and it simply wasn't ready to shoot yet. It also takes a good second before it can take the next shot. Even for a ten-year-old camera, that is annoyingly slow. I also very much missed a functional, movable back screen for shooting from low or high angles without contorting myself into camera-yoga, as well as modern amenities such as USB charging and easy access to the SD card. I'm sure, however, that more modern models are at least faster, even if the back screen remains fixed and there is still no sensor stabilization.
After two weeks with the camera, I had to return the Leica M240. This is a piece of camera gear I had long wanted to try. I felt the immediacy of the window finder, the simplicity of the rangefinder focusing, the heritage and cachet of the brand. These things were indeed beautiful. But it is still with some relief that I conclude that this camera is ultimately not for me. I can respect anyone for whom this experience strengthens their connection with the process and the images. But I realized that I prefer a camera that gets out of my way, and the Leica is not that.
Python Inception
At most companies I have worked for, there was some internal Python code base that relied on an old version of Python. But especially for data science, I'd often want to execute that code from an up-to-date Jupyter Notebook, to do some analysis on results.
When this happened last time, I decided to do something about it. Here's a Jupyter cell magic that executes the cell's code in a different Python, pipes out all of STDOUT and STDERR, and imports any newly created variables into the host Python. Use it like this:
%%py_magic /old/version/of/python
import this
truth = 42
When this cell executes, you will see the Zen of Python in your output, just as if you had run import this in the host Python, and the variable truth will now be 42 in the host Python.
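As a quick sanity check (assuming the magic cell above ran without errors), the transferred variable behaves like any other variable in a later cell of the host kernel:

print(truth)        # prints 42
print(type(truth))  # <class 'int'>, reconstructed from the pickle file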
To get this magic, execute the following code in a preceding cell:
import subprocess
import sys
import pickle
import pathlib  # needed to remove the temporary pickle file at the end
import textwrap
from IPython.core.magic import needs_local_scope, register_cell_magic

@register_cell_magic
@needs_local_scope
def py_magic(line, cell, local_ns=None):
    proc = subprocess.Popen([line or 'python'],
                            stdin=subprocess.PIPE,
                            stdout=subprocess.PIPE,
                            stderr=subprocess.PIPE,
                            encoding='UTF8')
    # send a preamble to the client python, and remember all pre-existing local variable names:
    proc.stdin.write(textwrap.dedent("""
        import pickle as _pickle
        import types as _types
        _names_before = [k for k, v in locals().items()] + ['_f', '_names_before']
        try:
        """))
    # send the cell's contents, indented to run in the try:
    for line in cell.splitlines():
        proc.stdin.write("    " + line + "\n")  # indent!
    # send a postamble that pickles all new variables or thrown exceptions:
    proc.stdin.write(textwrap.dedent("""
        # save results to result.pickle
        except Exception as exc:
            with open('result.pickle', 'wb') as _f:
                _pickle.dump({'type': 'error', 'value': exc}, _f)
        else:
            with open('result.pickle', 'wb') as _f:
                _values = {k: v for k, v in locals().items() if not isinstance(v, _types.ModuleType) and not k in _names_before}
                _safe_values = {}  # skip any unpickleable variables
                for k, v in _values.items():
                    try:
                        _pickle.dumps(v)
                    except Exception as _exc:
                        print(f'skipping dumping {k} because {_exc}')
                    else:
                        _safe_values[k] = v
                _pickle.dump({'type': 'result', 'value': _safe_values}, _f)
        finally:
            quit()
        """))
    # print any captured stdout or stderr:
    stdout, stderr = proc.communicate()
    if stdout:
        print(stdout, file=sys.stdout)
    if stderr:
        print(stderr, file=sys.stderr)
    # load new local variables or throw error:
    try:
        with open('result.pickle', 'rb') as f:
            result = pickle.load(f)
        if result['type'] == 'error':
            raise result['value']
        elif result['type'] == 'result':
            for key, value in result['value'].items():
                try:
                    local_ns[key] = value
                except Exception as exc:
                    print(f"skipping loading {key} because {exc}")
    finally:
        pathlib.Path('result.pickle').unlink()  # remove temporary file

del py_magic  # otherwise the function overwrites the magic
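Note that only pickleable variables make the trip back: anything that cannot be pickled stays behind in the child interpreter and merely triggers a "skipping dumping …" message. A hypothetical cell like this one (the file name is made up for illustration) would transfer answer into the host, but skip the open file handle:

%%py_magic /old/version/of/python
answer = 21 * 2                       # pickles fine, shows up in the host namespace
log_file = open('scratch.log', 'w')   # file objects cannot be pickled, so this one is skipped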
I love how this sort of trickery is relatively easy in Python. Also, this is the first time I've used a try with except, else, and finally.
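For anyone who hasn't combined all four clauses before, here is a minimal sketch (purely illustrative, not taken from the magic above) of how they interact:

try:
    value = int("42")         # may raise ValueError
except ValueError as exc:
    print(f"failed: {exc}")   # runs only if the try block raised
else:
    print(f"parsed {value}")  # runs only if the try block succeeded
finally:
    print("done")             # runs in every case, even after an exception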
AI Predictions
Meta just invested 30 billion dollars into AI accelerators[1]. That's roughly equivalent to one Manhattan Project worth of money. Meta must expect a comparable return on that investment. But with that kind of money, that return must be disruptive.
And yet, AI does not feel disruptive. In my life, I have witnessed a few consumer technology disruptions: portable computers, portable telephones, internet-connected "smart" telephones, always-available GPS. Perhaps even tablet computers, smart watches, and electric cars? They all felt exciting! They all felt immediately and obviously useful, if perhaps not always to me. But AI does not excite me. So if it's not for me, where is that $30B market that Meta is envisioning?
The best I can think of is a "command line for the common man". But the power of the command line comes from unflinchingly powerful commands and deterministic behavior. Both characteristics are the antithesis of current AI technology.
We will not give an AI assistant the power to execute "format my hard drive", or "delete my Google account", even though a command line interface clearly would. Yet without that power, the AI assistant is toothless, and less useful. Even if we did, how could we trust the AI assistant to actually do what we want, and not misunderstand us? When interacting with LLMs, I am reminded of Terry Pratchett's gods of the Discworld, who wield absolute power, but you don't want to ask them for help as they're too likely to do what they think you wanted instead of what you actually asked.
But without power, and without deterministic behavior, you can't have a command line experience.
I keep coming back to that question: What is the disruptive use case of AI? Sure, we'll outsource some tasks to AI that we previously outsourced overseas. This will result in rampant tragedy, and is already known to be difficult to pull off successfully. We'll enhance many tasks with short snippets of AI to speed up menial programming, writing, translation, and image generation. But that's a feature addition to existing software, not a disruption, let alone a $30B disruption.
Perhaps I'm wrong. Only time will tell. But I hereby predict that disruptive AI is a bubble.
Footnotes:
1: For reference, one billion dollars is a 1 km stack of $100 bills.
Books of 2023
Even though I did read some fiction last year, none really stuck with me. It appears that I am more interested in non-fiction these days. Strange how these things go.
Quest for Performance

Quest for Performance: The Evolution of Modern Aircraft, by Laurence K. Loftin
I have searched for a book like this for a long time: a history of airplane technology. The book details technological milestones and archetypes from the Wright flyer to the mid-1980s, with an emphasis on the two world wars and interwar years. It sometimes veers too close to a mere list of models and performances, but by and large still manages to tie it all into a comprehensible narrative. I guess you need to be a bit of an airplane nerd to appreciate this, but I found it fascinating!
And it is free to download, too.
The Soul of a New Machine

The Soul of a New Machine, by Tracy Kidder
The book retells the development of a computer during the interstitial years, after the big bang of computing in the first half of the century, but before the home computer revolution. This is a bit of a gap in common computing lore, and one I hadn't known much about.
This happened before standardized CPU architectures, so we get a glimpse into CPU hardware design, the user-land software side of things, and the microcode in between. This is quite an unusual perspective today, reliant as we are on common abstractions.
A fascinating read if you're interested in computing history, and one that doesn't require a Computer Science degree to follow the broader story.
Die großen Zeppeline

Die großen Zeppeline: Die Geschichte des Luftschiffbaus, by Peter Kleinheins
Half the book is reprints of technical reports by the original lead engineers who worked on the German Zeppelins. The other half is a retrospective view of Zeppelins in Germany and elsewhere.
There are myriad fascinating details about Zeppelin construction, like how their gas bags were made from animal intestines, or how they reclaimed water from engine exhaust to avoid losing weight while burning fuel. And it's especially fascinating to read about these things from people to whom this was the pinnacle of technology, and to juxtapose that with our modern perspective.
This is another book I had spent many years searching for. I found both it and Quest for Performance on Library Genesis, which is a terrific resource for researching books.
🪦 Emacs 2011-2023
For the last dozen years, I have used Emacs as my text editor and development environment. But that era has ended. In this post, I outline how I went from using Emacs as a cornerstone of my digital life to abandoning it.
In an ironic twist of history, it was Visual Studio that drove me to Emacs in the first place, and Visual Studio that ultimately pulled me away from it: In 2011, I was working on the firmware of a digital mixing console. The code was edited in Visual Studio, compiled with an embedded compiler, and source-controlled with command-line Git. It was ultimately Emacs that allowed me to tie this hodgepodge of idiosyncratic C++[1], Git, and the proprietary compiler into a somewhat sane development environment.
Over the years, my Emacs config grew, I learned Elisp, published my own Emacs packages, and developed my own Emacs theme. I went back to university, did my PhD, worked both on OSS and commercially, and almost all of this was done in Emacs. As particular standouts beyond traditional text editing, I used Emacs' Git client Magit every single day, and my own org-journal was absolutely vital as my research and work journal.

In 2023, however, I started a new job, once again with a Visual Studio codebase. This time, however, the code base and build system were tightly woven into the Visual Studio IDE, and only really navigable and editable therein. It thus made no sense to edit this code in Emacs, so I didn't. Perhaps I also needed a break.
And as my Emacs usage waned, its ancient keyboard shortcuts started to become a liability. I started mis-typing Emacs things in Visual Studio, and hitting Windows shortcuts in Emacs. Friction began to arise. At the same time, I started noticing how poorly Emacs runs on Windows. Startup takes many seconds, it does not integrate well into the task bar[2], it doesn't handle resolution changes gracefully, and it's best I don't start talking about its horrendously broken mouse scrolling. And of course it can't scroll point out of the window[3].
My last use case for Emacs was org-journal. I ended up porting a basic version of it to Visual Studio Code. Having thus written a text editor plugin for both editors, I have to be blunt: both the anachronistic bone-headedness of Elisp and the utter insanity of TypeScript's Node APIs make for terrible plugin-writing environments. A few years ago I did the same exercise in Sublime Text's Python API, which was a beautiful, simple, quick affair. But I do enjoy a programming puzzle, so here we are.
The final nail in Emacs' coffin came from an unexpected corner: For all my professional life, I was a solo coder. My Emacs was proudly black-and-white (different fonts instead of different colors!), and my keyboard shortcuts were idiosyncratically my own. I did not merely use Emacs. I had built MY OWN Emacs. I like to think this built character, and API design experience. But it was of course a complete non-starter for pair programming. After having tasted Visual Studio (± Code) Live Sharing, there was simply no going back.
And thus, I am saddened to see that I haven't started Emacs in several weeks. I guess this is goodbye. This blog is still rendered by Emacs, and I still maintain various Emacs modules. My journal is still written in org-mode. But it is now edited in Visual Studio Code.
Footnotes:
1: An eclectic subset of C++, intersected with the limitations of the embedded compiler. This was decidedly pre-"modern" C++, and probably less than the sum of its parts.
2: Usually, a program's taskbar button starts the program and represents it while running. Emacs spawns a new button instead.
3: This is Emacs-speak for "it can't scroll the cursor outside the viewport".

bastibe.de by Bastian Bechtold is licensed under a Creative Commons Attribution-ShareAlike 3.0 Unported License.