Tranquil Java : About

Introduction

Let's be honest here, TJ looks like the Borland products: Turbo Pascal, Turbo C++, Turbo Debugger, and so on. Clearly this is not a coincidence. But why? And why now? Aren't TUIs dead dead dead?

The why is easy enough: Turbo Pascal and Turbo C++ were a game changer for the entire industry. They were bar none the easiest way to get from Hello World to a significant application for a long time. Computer science departments standardized on them, books were readily available, and a large pool of DOS-based libraries and applications were developed. Most of those applications that used standard C I/O (stdio.h) for user interaction were easily ported over to the Unix world.

Unfortunately, when the world moved to Windows the TUI applications built with the Turbo products were left behind and died with DOS. Borland even abandoned their Turbo Vision codebase, claiming it was in the public domain. (However, they failed to REALLY declare it free, so it remains to this day in copyright limbo until an owner of the copyright decides to truly free it....) It is not clear why Turbo Vision remained the only good text-mode library for DOS with full mouse support and draggable, resizable, stackable windows, but it is apparent that when Borland let it languish in copyright limbo, a whole generation of potential TUI applications was lost.

TUIs stayed dead for about twenty years. Why? Was it that no one wanted them, or that no one was able to make them? I believe it was largely the latter: very few people were in a position to make them, for several reasons.

Reason 1: Licensing

First, the failure to legitimately license Turbo Vision as public domain or FOSS killed several years' worth of momentum on the C and C++ platforms. By January 2000 it was known that Debian could not ship Turbo Vision, the Free Vision library used by Free Pascal did not spawn any competing projects (presumably few people wanted to transliterate 40 kloc of Pascal back to C++), and much of the software world was busy with Java, C#, migrating off of Visual Basic, and growing into the "Web 2.0/3.0" HTML/CSS/JavaScript ecosystem. By the time people started to notice the lack of TUIs on modern systems, the window to migrate such a system out of the DOS world had closed.

Reason 2: Ncurses

Second, the Unix curses standard provided a local minimum that was very hard to get out of. People wanting to write new TUIs for Unix-like systems were pushed into the C language (or languages with easy C integration) and (n)curses, which brought with it difficulties in: a) porting to Windows (PDCurses and ncurses differ in non-trivial ways); b) Unicode output (which requires modifying every output call site, very painful for an existing application); c) full keyboard support (to see things like Ctrl-Home/Ctrl-End); and d) full mouse support (buttons are easy, motion is not, so dragging windows is hard). curses is very useful for getting colors and lines in text mode, but it also hamstrings the full feature set of Xterm. Getting curses to behave "like DOS" is a deep problem that not a lot of programmers have time to fully solve.
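For contrast, here is a minimal sketch of what bypassing curses entirely and writing raw Xterm/ECMA-48 escape sequences looks like. The class and method names are my own inventions, not from any real library, and the sequences shown are only a tiny slice of what a full backend would need:

```java
// A minimal sketch of driving an Xterm-compatible terminal directly,
// without curses: raw ECMA-48/Xterm escape sequences for clearing the
// screen, moving the cursor, and setting colors.
public class XtermDirect {
    static final String ESC = "\u001b";

    // Build one "frame": clear, position the cursor, set attributes, print.
    static String buildFrame() {
        return ESC + "[2J"          // clear the whole screen
             + ESC + "[5;10H"       // move cursor to row 5, column 10
             + ESC + "[1;33;44m"    // bold yellow text on a blue background
             + "Hello from a raw Xterm"
             + ESC + "[0m";         // reset all attributes
    }

    public static void main(String[] args) {
        System.out.print(buildFrame());
    }
}
```

The simplicity is deceptive: these few sequences are easy, but a real backend also has to manage raw keyboard input, mouse reporting, resize events, and restoring the terminal state on exit, which is exactly the "deep problem" described above.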

Reason 3: X-Based Terminals

Third, the Unix world needed time to settle on several standards that would enable Xterm to behave "like DOS": UTF-8 for the encoding, Xterm (rather than VT100) as the terminal type, and UTF-8 or SGR mouse coordinates to report the mouse without corrupting the UTF-8 stream. These pieces did not come "out of the box" in the major distros and terminals until around 2009. They are also not part of the curses standard, so using them means either talking directly to an Xterm rather than through curses, or using ncurses-specific extensions that assume an Xterm on the other end (and going this route does nothing to help the Windows users on PDCurses). Understanding Xterm well enough to make it behave "like DOS" is another deep problem that not a lot of programmers have time to solve.
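To make the SGR mouse piece concrete, here is a sketch of decoding one SGR (Xterm mode 1006) mouse report. The wire format is ESC [ < button ; column ; row, terminated by 'M' (press/motion) or 'm' (release), and it keeps coordinates in plain ASCII digits so they never corrupt a UTF-8 stream. The class and field names below are mine, not from any library:

```java
// Sketch: decoding one SGR (Xterm mode 1006) mouse report of the form
// ESC [ < button ; column ; row followed by 'M' (press) or 'm' (release).
public class SgrMouse {
    final int button, column, row;
    final boolean release;

    SgrMouse(int button, int column, int row, boolean release) {
        this.button = button;
        this.column = column;
        this.row = row;
        this.release = release;
    }

    static SgrMouse decode(String report) {
        // A lowercase 'm' terminator means button release.
        boolean release = report.endsWith("m");
        // Strip the leading "ESC [ <" and the trailing M/m terminator.
        String[] parts = report.substring(3, report.length() - 1).split(";");
        return new SgrMouse(Integer.parseInt(parts[0]),
                            Integer.parseInt(parts[1]),
                            Integer.parseInt(parts[2]),
                            release);
    }

    public static void main(String[] args) {
        // Button 0 pressed at column 12, row 5:
        SgrMouse e = decode("\u001b[<0;12;5M");
        System.out.println(e.column + "," + e.row);  // prints "12,5"
    }
}
```

Note how the press/release distinction lives in the terminator character rather than the button number; the older X10-style encoding packed everything into single bytes, which is exactly what broke under UTF-8.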

Reason 4: Terminal Widget

Finally, there was a missing piece that is critical if one wants to offer a good experience on Unix: a terminal emulator widget complete enough to "pass vttest." It turns out that no one will use a windowing toolkit on Unix if they cannot shell out to a real terminal window, and no one will consider a new IDE if they cannot easily use their favorite existing editor with it. A terminal widget on Unix is thus much like OLE/ActiveX on Windows: it provides the glue that can integrate different UI tools into one common environment. A TUI system without a good terminal will always suffer poor adoption on Unix.

Conclusion

That the curses standard and Xterm evolution might have delayed the resurrection of TUIs was not too surprising. But that a terminal emulator widget is a hard requirement was an unexpected realization that only came to light after starting to write TJ. Yet it brought into sharp relief why TUIs have languished for 20 years: the path to a general-purpose TUI on Unix goes through programs like screen, tmux, and xterm rather than mc, ranger, or dialog. The pool of people who have cleared the vttest hurdle and can therefore provide the terminal, and also understand how curses works well enough to not need it and can therefore create a backend that speaks directly to Xterm, is very small indeed. (And the fraction of those people with interest in DOS-era TUIs and free time to recreate one is even smaller.)

But the pieces are here now. I would like to see TUIs make a comeback. They are blazing fast and slick as hell, so why not? The technical barriers to creating products that can far exceed the standard set by the Turbo applications are now cleared, so perhaps a few good applications can bring TUIs back into people's minds as something worth writing new and interesting things in.

Speaking of speed: I have already said TUIs are fast, but why are they so fast, even on Swing? Look at the math! A screen update in Swing for 80x25 cells on a 9x16 font at 32-bit color depth requires 9x16x80x25x(32/8) = 1,152,000 bytes: naive drawing in Swing would require clipping checks and such against over 1 MB per screen update. But a TUI cell is 2 bytes of attributes and, say, 4 bytes of character, so the frontend draw needs to do all its work against only 80x25x(2+4) = 12,000 bytes: about 1% of the bytes of the GUI, i.e. only 1% of the data to check for clipping, color changes, and glyph changes. A TUI can also cache glyphs after they are first drawn, so that nearly all of the work is blitting rather than font rendering, and we all know blits are optimized to be as memory-aligned as possible. The TUI model also filters mouse events down by roughly 90%, generating them only when the pointer crosses a text cell boundary. So mouse input is a tenth of the data to process, the screen draw is a hundredth, and the rendering is who-knows-how-much less thanks to the glyph cache. A TUI, even one using a GUI toolkit for I/O, can therefore do its job with roughly two orders of magnitude less work than that same GUI toolkit would need. There is value in building new TUI windowing models, even on systems that do not speak to Xterms!
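The arithmetic above can be checked mechanically; this little sketch just recomputes the two byte counts and their ratio:

```java
// Working through the arithmetic above: bytes touched per full-screen
// update for a naive pixel draw versus a TUI cell buffer.
public class ScreenMath {
    // Naive GUI draw: every pixel of every glyph, at 4 bytes per pixel.
    static int guiBytes() {
        int cols = 80, rows = 25;       // text screen size in cells
        int fontW = 9, fontH = 16;      // glyph size in pixels
        int bytesPerPixel = 32 / 8;     // 32-bit color
        return cols * rows * fontW * fontH * bytesPerPixel;
    }

    // TUI cell buffer: 2 attribute bytes + 4 character bytes per cell.
    static int tuiBytes() {
        int cols = 80, rows = 25;
        int bytesPerCell = 2 + 4;
        return cols * rows * bytesPerCell;
    }

    public static void main(String[] args) {
        System.out.println(guiBytes());  // 1152000
        System.out.println(tuiBytes());  // 12000
        System.out.printf("ratio: %.1f%%%n",
                100.0 * tuiBytes() / guiBytes());  // ratio: 1.0%
    }
}
```

So the cell buffer really is about two orders of magnitude smaller than the pixel buffer, before the glyph cache and mouse-event filtering shave off even more work.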

And a special challenge to all the hip new languages out there: surely Java 1.6 of all languages can't have the best TUI-based IDE! Make yours, and show us all how you did it better! :-)

For the history of how I got here, see this.