Basti's Scratchpad on the Internet
06 Apr 2025

The Perfect 23mm Lens

What's in a perfect lens? Is it sharpness? Contrast? Microcontrast? Bokeh? Whatever does that even mean?

Today is new lens day. To test my purchase, and to get it out of my system, I will measure my new lens. It's the Fujifilm XF 23mm f/1.4 R LM WR, a preposterously complex lens for a ridiculously high-resolution camera. Will it prevail against my older Fujifilm XF 23mm f/1.4 R? How will it fare against my travel zoom, the Fujifilm XF 16-80mm f/4 R OIS WR?

First up, resolution: I printed a test chart on A3 paper. My first attempt, printing the PDF directly from Preview.app, blurred the chart's lines above 1250 L/PH [1], which implies that Preview.app prints at about 1250/8.7 in ≈ 150 DPI [2]. Not satisfied, I exported the test chart to a high-DPI TIFF and printed it with Canon's print application, which resolved even the finest lines on the chart.
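
As a quick sanity check of that estimate, here is the arithmetic as a tiny sketch (the variable names are mine, purely for illustration):

blur_onset_lph = 1250        # L/PH where the printed chart started blurring
chart_height_in = 8.7        # height of the measuring area in inches
print(blur_onset_lph / chart_height_in)   # about 144 lines per inch, i.e. roughly 150 DPI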

I took pictures of the marked 3:2 area on the chart. Since all my lenses are easily sharp enough to resolve the chart's maximum marking of 20 L/PH when it fills the frame, I aligned the chart with the center cell of a 3x3 grid on my camera display and cropped appropriately in post.

[Figure: Test shots of a resolution chart in the center of the frame, with the resolution limit marked as a red line in 300x L/PH, and microcontrast in ΔL]

Each of these crops is at 1:1 resolution of the original capture, at the center of the frame, with no sharpening, straight from the raw file. Exposure was adjusted to a consistent 95% L in HSL. The red line indicates the resolution at which the lines are just resolvable [3]. The resolution scale is in 300x L/PH; for instance, a reading of 14 corresponds to 4200 L/PH, which is slightly short of the 5150-pixel height of the 40 MP image. Since the color filter demosaicing is not entirely lossless, this is an excellent result.
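
To make that scale concrete, here is the conversion as a tiny sketch, using the approximate 5150-pixel image height quoted above:

chart_reading = 14                 # value read off the chart
lph = chart_reading * 300          # chart scale is 300x L/PH, so 4200 L/PH
image_height_px = 5150             # approximate pixel height of the 40 MP image
print(f"{lph} L/PH, {lph / image_height_px:.0%} of the pixel height")   # 4200 L/PH, 82%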

On average, the new lens resolves a good 10% more lines than the old 23 prime, and perhaps 20% more than the zoom. In the real world, these numbers mean nothing. All three of these lenses are easily sharp enough for everything I might throw at them.

Next to the resolution is the contrast level in ΔL [4] (HSL), measured at the right side of each pack of lines, which is a good proxy for the contrast of high-frequency detail, or microcontrast. A lower black level means more microcontrast. Indeed, this is where the new lens shows a clear improvement over the other two lenses, with visibly more contrast in fine details.

Also notable is a visible magenta color cast on the old 23mm prime wide open, and a minor blue cast on the new 23mm. In normal images, color fringe suppression mostly gets rid of these, so they are not usually worrisome.

Then I repeated the procedure in the image corners: the same pictures, but this time with the resolution chart covering the top-right cell of my 3x3 grid:

[Figure: Test shots of a resolution chart in the corner of the frame, with the resolution limit marked as a red line in 300x L/PH, and microcontrast in ΔL]

As expected, resolution and contrast fall off towards the image corners. For the new lens, the falloff is about 20% in resolution and 40% in microcontrast at open apertures, which brings it roughly down to the level of the old lens' center. For the old 23mm prime, it's about 40% in resolution and 40% in contrast at open apertures, which is visibly soft. The effect diminishes rapidly at smaller apertures, however, and is mostly gone by f/4. Consequently, the 16-80 f/4 merely loses a small bit of contrast.

In summary of these objective measurements, the new lens is ridiculously good. In the real world, I would expect to see a minor difference in microcontrast, mostly visible in the image corners. But stop down, and all three lenses will perform very similarly.

As a sanity check, here's the central resolution target, shot a bit closer, to get a feeling for what the above measurements look like on larger structures:

[Figure: Resolution target in the center of the frame, zoomed out]

As you can see, the differences in resolution are not a big deal in the real world, and even the contrast levels for larger subjects are not as different as the measurements might suggest. The color fringing wide open is still visible, however.

Next, let's look at the lenses' bokeh. For an objective measure, I took a picture of various film boxes along a measuring stick. Focus was at 50cm distance, with the close box at 25cm, and the far one at 75cm. This should give a good indication of the out-of-focus rendering on both sides of the focus plane.

[Figure: Bokeh evaluation 1: test shots of out-of-focus subjects; focus at 50 cm, close target at 25 cm, far target at 75 cm]

You can see a slightly harsher bokeh and very minor color fringes in the old lens at f/1.4. But from these images, I don't see a compelling reason to choose one over the other. Curiously, at f/4, the 16-80 seems to show slightly less blur than the two primes.

For an assessment in more realistic conditions, I took a few photos of flowers and point lights. The flowers give a good indication of the focus transition from in-focus to out-of-focus, while the point lights show how specular highlights are rendered.

[Figure: Bokeh evaluation 2: focus transition in a bed of flowers, and rendering of point lights]

The transition pictures once more show a tendency for colored seams on the old prime, which is absent on the new one. This makes the new lens' bokeh seem a little smoother. Apart from that, the differences are very minor. As expected, there is no significant difference in detail between these shots. The point lights essentially confirm this, with a bit more color fringing on the old lens.

In conclusion, I found the new Fujifilm XF 23mm f/1.4 R LM WR slightly sharper than the old Fujifilm XF 23mm f/1.4 R, with noticeably less color fringing and slightly better microcontrast. Once stopped down to f/4, these differences become insignificant. At that point, both prime lenses retain only a very minor resolution advantage over the Fujifilm XF 16-80mm f/4 R OIS WR.

With that, my curiosity about the technical quality of these lenses is satisfied, I have verified that my new lens is functioning properly, and I can now go out and shoot pictures.

Footnotes:

[1] lines per picture height

[2] the height of the measuring area was 8.7 in (of the 11.7 in paper)

[3] as judged by my eyes

[4] I calibrated the white level to L=95, and ΔL is the difference between the white level and the black level on these lines.

Tags: photography

The Leica Experience

In the world of photography, there is one brand that stands apart from the rest: Leica. It exudes an air of quality, refinement, and heritage. These cameras signal wealth and sophistication, and serve equally as cameras and as fashion pieces. Many of the greatest photographers in history shot their work on Leica. They are very expensive, too.

That cachet is strongest with their rangefinder series of cameras, which use an anachronistic manual focusing method that relies on optomechanical clockwork instead of digital trickery, and where you frame your shots through a plain window with overlaid framelines, free of the digital displays and distracting iconography of modern cameras. A reduction to the necessary, pure distilled photography, according to the myth.

It was thus with much anticipation that I took delivery of a Leica M240 with a Leica 35mm f/2 “Summicron” lens, graciously loaned to me by a friend from my local camera club. I have learned often enough to optimize for joy, not utility, so this loan could turn into a costly prospect if the experience appealed to me.

The camera is indeed a beautiful object, exquisitely machined from solid metals. In an unexpected inversion of modern camera design, it sports a chunky body but a small lens. It is a pleasant thing to hold, if a bit heavier than I had expected. Operation of the camera is pure simplicity: there's a shutter speed dial with an automatic setting, an aperture ring on the lens, and a focus tab. The latter two offer no automation, unlike on modern cameras.

My Leica-using friends in the camera club predicted that I'd get used to the manual focusing quickly, and indeed I did. But precise focusing with a rangefinder is somewhat slow and only possible in the frame center. So where speed was required, I instead relied on good old zone focusing: pre-setting the expected distance on the lens and stopping down a bit to get a deep zone of focus. Thus prepared, nothing remains but to press the shutter button. This perfect simplicity of just being there, and worrying about nothing but the scene and the composition, feels unexpectedly freeing. On the flip side, of course, framing with a window finder is imprecise, and getting accurate focus at large apertures is neither easy nor fast.
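
For the curious, zone focusing boils down to a depth-of-field calculation. Here is a minimal sketch using the standard thin-lens formulas; the 35 mm, f/8, 3 m numbers are just an illustrative example, not a record of what I actually shot:

def depth_of_field(f_mm, N, s_m, coc_mm=0.03):
    """Near and far limits of acceptable focus (thin-lens approximation).
    f_mm: focal length, N: f-number, s_m: focus distance in metres,
    coc_mm: circle of confusion, about 0.03 mm for full frame."""
    s = s_m * 1000                               # work in millimetres
    H = f_mm ** 2 / (N * coc_mm) + f_mm          # hyperfocal distance
    near = s * (H - f_mm) / (H + s - 2 * f_mm)
    far = s * (H - f_mm) / (H - s) if s < H else float("inf")
    return near / 1000, far / 1000               # back to metres

print(depth_of_field(35, 8, 3))   # roughly (1.9, 7.2): everything from about 1.9 m to 7 m is acceptably sharp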

Another surprise was the rangefinder lenses. These are very much unlike modern designs, favoring compactness and rendering over versatility. They are not sharp wide open, exhibit strong field curvature, and need to be stopped down to become sharp off-center. Annoyingly, the combination of field curvature and the centered rangefinder patch makes anything but centered compositions impossible wide open. Still, this is actually a fair tradeoff for smaller lenses, and one I wish more modern lenses would offer, especially as it also tends to make for a smoother focus transition and bokeh.

Overall, I enjoyed the rangefinder style of shooting. However, the Leica M240 also annoyed me a bit by taking too long to switch on. More than once, I raised it to my face and it simply wasn't ready to shoot yet. It takes a good second before the next shot, too. Even for a ten-year-old camera, that is annoyingly slow. I also very much missed a functional, movable back screen, to shoot from low or high angles without contorting myself into camera-yoga, and modern amenities such as USB charging and easy access to the SD card. I'm sure, however, that more modern models are at least faster, even if the back screen remains fixed and there is still no sensor stabilization.

After two weeks with the camera, I had to return the Leica M240. This is a piece of camera gear I had long wanted to try. I felt the immediacy of the window finder, the simplicity of the rangefinder focusing, the heritage and cachet of the brand. These things were indeed beautiful. But it is still with some relief that I conclude that this camera is ultimately not for me. I can respect anyone for whom this experience strengthens their connection with the process and the images. But for me, I realized that I prefer a camera that gets out of my way, and the Leica is not that.

Tags: photography

Python Inception

At most companies I have worked for, there was some internal Python code base that relied on an old version of Python. But especially for data science, I'd often want to execute that code from an up-to-date Jupyter Notebook, to do some analysis on results.

When this happened last time, I decided to do something about it. Here's a Jupyter cell magic that executes the cell's code in a different Python, pipes out all of STDOUT and STDERR, and imports any newly created variables into the host Python. Use it like this:

%%py_magic /old/version/of/python
import this
truth = 42

When this cell executes, you will see the Zen of Python in your output, just as if you had run import this locally, and the variable truth will now be 42 in the host Python.

To get this magic, execute the following code in a preceding cell:

import subprocess
import sys
import pathlib
import pickle
import textwrap
from IPython.core.magic import needs_local_scope, register_cell_magic
 
@register_cell_magic
@needs_local_scope
def py_magic(line, cell, local_ns=None):
    proc = subprocess.Popen([line or 'python'],
                            stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.PIPE,
                            encoding='UTF8')
    # send a preamble to the client python, and remember all pre-existing local variable names:
    proc.stdin.write(textwrap.dedent("""
        import pickle as _pickle
        import types as _types
        _names_before = [k for k, v in locals().items()] + ['_f', '_names_before']
        try:
    """))
    # send the cell's contents, indented to run in the try:
    for line in cell.splitlines():
        proc.stdin.write("    " + line + "\n")  # indent!
    # send a postamble that pickles all new variables or thrown exceptions:
    proc.stdin.write(textwrap.dedent("""
        # save results to result.pickle
        except Exception as exc:
            with open('result.pickle', 'wb') as _f:
                _pickle.dump({'type':'error', 'value': exc}, _f)
        else:
            with open('result.pickle', 'wb') as _f:
                _values = {k:v for k, v in locals().items()
                               if not isinstance(v, _types.ModuleType) 
                                  and not k in _names_before}
                _safe_values = {}  # skip any unpickleable variables
                for k, v in _values.items():
                    try:
                        _pickle.dumps(v)
                    except Exception as _exc:
                        print(f'skipping dumping {k} because {_exc}')
                    else:
                        _safe_values[k] = v
                _pickle.dump({'type':'result', 'value': _safe_values}, _f)
        finally:
            quit()
    """))
    # print any captured stdout or stderr:
    stdout, stderr = proc.communicate()
    if stdout:
        print(stdout, file=sys.stdout)
    if stderr:
        print(stderr, file=sys.stderr)

    # load new local variables or throw error:
    try:
        with open('result.pickle', 'rb') as f:
            result = pickle.load(f)
        if result['type'] == 'error':
            raise result['value']
        elif result['type'] == 'result':
            for key, value in result['value'].items():
                try:
                    local_ns[key] = value
                except Exception as exc:
                    print(f"skipping loading {key} because {exc}")
    finally:
        pathlib.Path('result.pickle').unlink(missing_ok=True)  # remove the temporary file if it was written
  
del py_magic  # otherwise the function overwrites the magic

I love how this sort of trickery is relatively easy in Python. Also, this is the first time I've used a try with except, else, and finally.
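
For reference, here is a minimal sketch of how those four clauses interact (the classify helper is hypothetical, not part of the magic above):

def classify(text):
    try:
        value = int(text)            # may raise ValueError
    except ValueError as exc:
        print("except:", exc)        # runs only if the try block raised
    else:
        print("else:", value)        # runs only if the try block did not raise
    finally:
        print("finally: always")     # runs in either case, even on return or raise

classify("42")     # prints "else: 42" and "finally: always"
classify("nope")   # prints "except: ..." and "finally: always"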

Tags: computers python

AI Predictions

Meta just invested 30 Billion Dollars into AI accelerators [1]. That's roughly equivalent to one Manhattan Project worth of money. Meta must expect a comparable return on that investment. But with that kind of money, that return must be disruptive.

And yet, AI does not feel disruptive. In my life, I have witnessed a few consumer technology disruptions: portable computers, portable telephones, internet-connected "smart" telephones, always-available GPS. Perhaps even tablet computers, smart watches, and electric cars? They all felt exciting! They all felt immediately obviously useful, if perhaps not always to me. But AI does not excite me. So if it's not for me, where is that $30B market that Meta is envisioning?

The best I can think of is a "command line for the common man". But the power of the command line comes from unflinchingly powerful commands and deterministic behavior. Both of these characteristics are the antithesis of current AI technology.

We will not give an AI assistant the power to execute "format my hard drive" or "delete my Google account", even though a command line interface clearly does. Yet without that power, the AI assistant is toothless, and less useful. Even if we did, how could we trust the AI assistant to actually do what we want, and not misunderstand us? When interacting with LLMs, I am reminded of Terry Pratchett's gods of the Discworld, who wield absolute power, but whom you don't want to ask for help, as they're too likely to do what they think you wanted instead of what you actually asked.

But without power, and without deterministic behavior, you can't have a command line experience.

I keep coming back to that question: What is the disruptive use case of AI? Sure, we'll outsource some tasks to AI that we'd previously have outsourced overseas. This will result in rampant tragedy, and is already known to be difficult to pull off successfully. We'll enhance many tasks with short snippets of AI: speeding up menial programming, writing, translation, image generation. But that's a feature addition to existing software, not a disruption, let alone a $30B disruption.

Perhaps I'm wrong. Only time will tell. But I hereby predict that disruptive AI is a bubble.

Footnotes:

[1] For reference, one Billion Dollars is a 1 km stack of $100 bills.
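
As a quick back-of-the-envelope check of that comparison (the ~0.11 mm bill thickness is my assumption, not something from the post):

bill_thickness_m = 0.11e-3                    # a US banknote is roughly 0.11 mm thick
bills = 1_000_000_000 / 100                   # ten million $100 bills per billion dollars
print(bills * bill_thickness_m / 1000)        # about 1.1 km per billion, so roughly 33 km for $30B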

Tags: computers

Books of 2023

Even though I did read some fiction last year, none of it really stuck with me. It appears that I am more interested in non-fiction these days. Strange how these things go.

Quest for Performance

[Image: book cover of Quest for Performance]

Quest for Performance: The Evolution of Modern Aircraft, by Laurence K. Loftin

I have searched for a book like this for a long time: a history of airplane technology. The book details technological milestones and archetypes from the Wright flyer to the mid-1980s, with an emphasis on the two world wars and interwar years. It sometimes veers too close to a mere list of models and performances, but by and large still manages to tie it all into a comprehensible narrative. I guess you need to be a bit of an airplane nerd to appreciate this, but I found it fascinating!

And it is free to download, too.

The Soul of a New Machine

[Image: book cover of The Soul of a New Machine]

The Soul of a New Machine, by Tracy Kidder

The book retells the development of a computer during the interstitial years, after the big bang of computing in the first half of the century but before the home computer revolution. This is a bit of a gap in the common computing lore, and one I hadn't known much about.

This happened before standardized CPU architectures, so we get a glimpse into CPU hardware design, the user-land software side of things, and the micro-code in between. This is quite an unusual perspective today, reliant on common abstractions as we are.

A fascinating read if you're interested in computing history, without requiring a Computer Science degree for the broader story.

Die großen Zeppeline

[Image: book cover of Die großen Zeppeline]

Die großen Zeppeline: Die Geschichte des Luftschiffbaus, by Peter Kleinheins

Half the book is reprints of technical reports of the original lead engineers who worked on the German Zeppelins. The other half is a retrospective view of Zeppelins in Germany and elsewhere.

There are myriad fascinating details about Zeppelin construction, like how their gas bags were made from animal intestines, or how they reclaimed water from engine exhaust to avoid losing weight while burning fuel. And it's especially fascinating to read about these things from people to whom this was the pinnacle of technology, and to juxtapose their view with our modern perspective.

This is another book I had searched for over many years. I found both it and Quest for Performance on Library Genesis, which is a terrific resource for researching books.