The Final Nail in the Coffin for Go

Published at 16:47 on 25 June 2024

The same Go program that I fought with for two days, getting it to a point where it was still unfinished, but where:

  • I had satisfied my curiosity that it was, indeed, possible to do the particular thing I was struggling with in Go, and
  • It had been painful enough to prove to me that I should not consider Go a language of choice.

… has now been coded to the same stage of completion in C++. It took half the time, half the effort, and under half the lines of code that it took in Go.

And that is as an absolute novice C++ programmer, writing his second C++ program ever. I had been experimenting with Go for about a year before I recently gave up on it.

It’s not that C++ is good, mind you. It’s a total cruft fest that should have been put out of business by something more modern at least 15 years ago. But it’s still possible to do things in C++ without the language and/or standard library persistently getting in your way like they do in Go.

Go is so bad it is literally worse than C++.

Go: Fooled Me Twice, Shame on Me

Published at 09:32 on 21 June 2024

I just can’t seem to learn. I so much want there to be a better alternative to C/C++ out there, i.e. a good, modern programming language that compiles down to machine code. And I just can’t stop thinking that Go might be it. Then I keep running into problems: Go and its libraries continually make it very difficult to do clever things.

My obstacle this time is how Go parses command-line arguments. Most modern languages do this by building a collection of objects to describe the allowed command-line syntax, making a parse call, and receiving in return a collection of objects describing the options and arguments found.

Go is different, probably because it was written by individuals skeptical of object-oriented programming. Instead, it is all based on passing pointers into an argument-parsing subsystem. When one initiates a parse, the pointed-to variables get set in ways reflecting what the user typed on the command line.
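
A minimal sketch of that pointer style, using the standard library’s flag package (the flag names here are made-up examples):

    package main

    import (
        "flag"
        "fmt"
    )

    func main() {
        // flag.Bool hands back a pointer; flag.StringVar takes one in.
        // Either way, flag.Parse() writes results through pointers.
        verbose := flag.Bool("verbose", false, "enable verbose output")
        var name string
        flag.StringVar(&name, "name", "", "a name to greet")
        flag.Parse()
        fmt.Println(*verbose, name)
    }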

That might work well enough in the simple case, but my case is not so simple. I am trying to write a family of related commands, each with a set of standard arguments, and most with some custom, command-specific arguments as well. Moreover, there is a configuration file, and values not specified as options on the command line should fall back to what is in that file.

In Java, I have done that by using the Apache Commons CLI library, subclassing the main class that holds the syntax description, and having it auto-populate itself with the standard arguments. Then my subcommands all use that class, and automagically get the standard options they need. No fuss, no muss, no repeated code, and all the options are in a single place.

Then I pull those parsed option values into the class representing a parsed configuration file, so that a command-line option will overwrite the in-memory copy of an in-file option. Presto! All the configurable values I need are now all in one place. And not only that, it was simple and easy to accomplish.

Go’s pointer-based argument parsing makes this basically impossible. Oh, there’s an alternate argument parsing library out there, but it is likewise broken by design, because it is pointer-based as well. Wait! There is a “value interface” that might offer an out? Nope, sorry, no escape: it only seems to support string values (even Boolean flags are unsupported!), and it is incompletely documented.
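
For reference, the standard flag package has a similar mechanism, the flag.Value interface, and it illustrates the same string-centric design: Set only ever receives the raw text typed on the command line. A minimal sketch, using a made-up repeatable-flag type:

    package main

    import (
        "flag"
        "fmt"
    )

    // listFlag implements flag.Value, whose entire contract is
    // String() string and Set(string) error. Note that Set only
    // ever receives the raw string from the command line.
    type listFlag []string

    func (l *listFlag) String() string { return fmt.Sprint(*l) }

    func (l *listFlag) Set(s string) error {
        *l = append(*l, s) // repeated -item flags accumulate
        return nil
    }

    func main() {
        var items listFlag
        flag.Var(&items, "item", "an item (may be repeated)")
        flag.Parse()
        fmt.Println(items)
    }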

I keep running into this sort of crap with Go. And only with Go. Other modern languages just don’t seem to have this degree of pervasive brokenness. The Python standard library, for example, parses arguments much the same way that the Apache library does in Java. Even the hoary old cruft-fest of a programming language that is C++ has a popular third-party library that does the right thing.

It’s not just argument parsing. Previously, I struggled with how Go’s character set support can’t signal an explicit error condition when it encounters invalid input. Java, Python, Ruby, C++: all can do this if requested. Not Go, at least not out of the box and not without a lot of extra effort.
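
To illustrate: in Go, converting arbitrary bytes to a string cannot fail, and decoding the result silently turns invalid bytes into U+FFFD instead of reporting an error. The best the standard library offers out of the box is letting you remember to check validity yourself:

    package main

    import (
        "fmt"
        "unicode/utf8"
    )

    func main() {
        b := []byte{'h', 'i', 0xff} // 0xff is never valid in UTF-8
        // Decoding (here, via range over the string) quietly yields
        // U+FFFD for the bad byte; no error is ever signalled.
        for _, r := range string(b) {
            fmt.Printf("%U ", r) // U+0068 U+0069 U+FFFD
        }
        fmt.Println()
        fmt.Println(utf8.Valid(b)) // false; but you must remember to ask
    }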

It’s bad enough to make me seriously question whether there’s any problem space out there for which Go is the most appropriate solution. I know there are large, successful software systems written in Go, but my own personal experiences make me strongly suspect that those projects were much harder to get to their current state of completeness than they would have been if written in some other environment, one less limiting and less pervasively crippled by bad design.

Perhaps it’s all just me, and others don’t feel the suck so much. Frankly, I don’t think so. But that is still just one person’s opinion. More damning, I think, is the verdict of Go’s original sponsor, Google: if Go really were the way to fill the need for a more modern language that compiles to machine code, Google would not be sponsoring the Carbon project.

It’s all a shame because, to reiterate, I really want there to be a better alternative to C/C++ out there. Alas, Go does not seem to be it.

Well, That Sure Sucked

Published at 15:46 on 17 June 2024

Yet again, a MySQL auto-update ran while I was out of town. Yet again, the auto-update auto-trashed my database.

In baseball, a hitter has three chances to hit the ball. But that is in a situation where another pitch takes under a minute to make. Life is not a baseball game. I just pissed away most of a day recovering from a MySQL failure. Again. Two strikes and you’re out, MySQL!

I am now back up, with all auto-updates disabled, and using MariaDB instead of MySQL.

Postscript

Published at 18:43 on 2 June 2024

It sucked even more than I expected it to. This is because, in addition to all the issues I mentioned previously, the site is crap from a technological point of view: slow, laggy, and full of flimsy client-side JavaScript that does things like randomly making the contents of your coding window disappear. Plus it doesn’t run at all unless I disable some of the protections on my browser.

At least I had fun sending the stupid thing arbitrary machine code, which, of course, was harmless and merely printed a message linking to the aforementioned article.

I highly doubt I will get selected for an actual interview, but why would I want to be, given what sort of garbage they feel comfortable shoving at people?

Why HackerRank Sucks

Published at 12:02 on 31 May 2024

Foreword

I have thought of writing this article more than once before. I just got assigned yet another HackerRank test as part of an interview process. Sometimes, I have blown such things off entirely; sometimes, I have attempted to do my best at answering them.

This time, I plan to do something different. I will include a link to this article in my response to the problem; if you are reading this as a result of following that link, then here is why I responded the way I did.

Introduction

HackerRank sucks for two main reasons:

  1. The general irrelevance of undergraduate computer science exercises to real-world programming, and
  2. The artificiality of HackerRank’s time constraints.

Irrelevance of Problems

In my experience, HackerRank’s exercises tend to be thinly-disguised rehashes of undergraduate computer science homework assignments. Such assignments tend to be heavily based on coding implementations of basic data structures, often trees of various sorts, that have very limited real-world relevance to solving problems on the job. I have written about this here before. It’s really not a surprise, given that the constraints of the HackerRank platform all but mandate such problems.

Effectively, such tests screen for either recency of undergraduate coursework (and thus lack of practical, on-the-job experience), or willingness to spend time brushing up on a skill set whose real-world utility is extremely limited (and thus for unquestioning submission to authority and willingness to obey pointless rules). The first is diametrically opposed to the goal of filling a senior-level position, and the second is diametrically opposed to the sort of environment that I personally thrive in.

Artificiality of Time Constraints

Real-world programming tasks generally do not pop unexpectedly out of nowhere, with no advance warning. They are usually foreseeable as part of the natural trends of evolution of software systems and the organizations they support. As such, the prudent developer has usually already spent some time thinking about such issues. Sometimes, of course, they do pop up unannounced (long-latent bugs sometimes manifest, and sometimes have severe impact). Even then, they don’t come with a count-down timer and a hard artificial deadline. It is possible to take a walk (I get some of my best ideas outdoors) or to bounce ideas off colleagues.

With HackerRank, there is no such subtlety. The clock is ticking, the artificial deadline is rapidly approaching, and it will be enforced without mercy. (Sometimes you even get a so-called proctored test, in which you must enable a spy camera. Leaving the room is considered cheating. Lucky you.) You will solve the problem in one sitting, and you will do it now. And the problem has limited real-world relevance at best.

Conclusion

Maybe I will submit an actual solution, and maybe I will not. It will depend on the exercise and how I feel about it at the time I submit my response. At any rate, the link here will be the most important part of that response.

Really, there is not much more to say, so it is time to wrap this up. HackerRank sucks, and now you know why.

Take That, iPhone!

Published at 15:01 on 11 April 2024

After a recent update, my iPhone started letting me control my headphone volume over a range running from threshold-of-pain loud to instant-deafness loud. I guess some aging hipster who ruined his hearing by going to too many rock concerts without hearing protection got appointed to a QC position at Apple.

Based on what I had read about human sound perception, I guessed I needed about 6 dB of attenuation to tame the thing. That turned out to be a simple matter of adding four resistors to the picture (two for each channel: one in series to cut the voltage in half, and another in parallel to restore the impedance the audio amplifier sees to what it was pre-attenuator).
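
For the curious, here is the arithmetic, assuming for illustration a 32 Ω load (I am not claiming that is my headphones’ actual impedance). The series resistor equal to the load halves the voltage, the parallel resistor across the amplifier’s output restores the impedance it sees, and halving a voltage is almost exactly 6 dB:

    % series resistor equal to the load halves the voltage
    \frac{V_{out}}{V_{in}} = \frac{R_L}{R_s + R_L} = \frac{32}{32 + 32} = \frac{1}{2}
    % parallel resistor restores the impedance the amplifier sees
    R_p \parallel (R_s + R_L) = R_L \;\Rightarrow\; R_p = 2 R_L = 64\,\Omega
    % halving the voltage is very nearly 6 dB of attenuation
    20 \log_{10}(1/2) \approx -6.02\,\mathrm{dB}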

The worst part about it was all the fiddly soldering (those connectors have some tiny terminals). But it works, and 6 dB was indeed the correct amount of attenuation needed to restore sanity to the device.

Spreadsheets Suck, Here’s Why

Published at 23:31 on 2 April 2024

It’s the math.

Specifically, they all (at least all the leading ones: Microsoft Excel, Apple Numbers, and LibreOffice Calc) use floating point numbers and arithmetic.

Fractional radix digits can only accurately represent fractions whose denominators’ prime factors are also factors of the radix. For the sort of base 10 numbers we are familiar with, this means that any fraction whose denominator can be represented as a product of 2’s and 5’s can be represented. As an example, 8 factors to 2³, so eighths can be represented with complete accuracy as decimal fractions (⅛ = 0.125). It takes three digits, of course, because 10³ (2³ ✕ 5³) is the lowest power of 10 that is an integer multiple of 8, but you can do it. And really, three digits isn’t that bad.

If you use a denominator that cannot so be represented, then you get an infinitely-long repeating fractional part. The canonical example of this is ⅓ turning into 0.3333333….

But computers use base 2, not base 10, and this creates a problem. 2 is itself prime, so fractional binary digits can only accurately represent fractions whose denominators are powers of 2. Everything else turns into a number with an infinitely-long repeating fractional part.

This is a big problem, because one of the most common uses of spreadsheets is financial calculations, and floating point numbers can only represent monetary amounts accurately if they are multiples of 25 cents. If a financial quantity does not end in .00, .25, .50, or .75, your spreadsheet is representing it wrong! Only slightly wrong, of course, but still wrong. And if you add and subtract enough numbers together, eventually the result will be off by a penny or two.
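
You can watch it happen in any language that uses the same IEEE 754 doubles the spreadsheets do; in Go, for instance:

    package main

    import "fmt"

    func main() {
        a, b := 0.10, 0.20 // ten cents and twenty cents, as binary floating point
        fmt.Println(a + b)       // 0.30000000000000004
        fmt.Println(a+b == 0.30) // false!
    }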

It is for this reason that banks use decimal arithmetic, not a processor’s built-in floating point arithmetic, for their financial calculations. They don’t want their customers’ balances to drift from reality by a few pennies per year. Banks have done this since just about forever. COBOL, one of the oldest high-level programming languages out there, and designed for business computing, uses decimal arithmetic by default, and this is why.
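
For contrast, exact arithmetic is not hard to come by. Go has no decimal type in its standard library, but its math/big package does exact rational arithmetic, which for money gives the same penny-perfect results; a minimal sketch:

    package main

    import (
        "fmt"
        "math/big"
    )

    func main() {
        // Work in exact rationals: dollar amounts expressed as cents over 100.
        a := new(big.Rat).SetFrac64(10, 100) // $0.10
        b := new(big.Rat).SetFrac64(20, 100) // $0.20
        sum := new(big.Rat).Add(a, b)
        fmt.Println(sum.FloatString(2)) // 0.30, exactly
    }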

The rationale for spreadsheets not doing likewise is “performance”, but frankly, that is a load of horse hockey. Yes, built-in floating point calculations are faster. But the performance hit from using decimal arithmetic is far from a deal-killer. COBOL dates from around 1960, when computers had only a tiny fraction of the computing power they do today, yet COBOL programs ran just fine way back then, and cranked out accurate results without gratuitous rounding errors. (Plus, your average spreadsheet is a lot smaller than your average batch of bank transactions to process.)

I was going to make more use of spreadsheets in figuring my income taxes this year, but after learning the above I am mostly sticking with good old dc, which uses decimal arithmetic. (Actually, it uses base 100, but when it comes to avoiding rounding errors, base 100 works identically to base 10, since the former is a power of the latter.)

Why Swift Is Not My Favourite Programming Language

Published at 22:44 on 7 February 2024

It’s the libraries, stupid.

The standard Swift library is laughably far from comprehensive. Things you can do in the standard libraries of Java, Python, PHP, C#, Ruby, and most other common modern programming languages just aren’t in there.

What you are supposed to do, from what I gather, is to use the Apple Foundation framework. There are several problems with that:

  • The framework is a hot mess. It got its start back in the 1990s as part of the NeXT operating system, and has been incrementally hacked on ever since. The documentation is likewise a mess: incomplete, cryptic, and poorly-organized. It is fully part of the pattern that Apple products tend to be as programmer-hostile as they are user-friendly.
  • The framework is still incomplete. Support for tasks as basic as doing buffered reads from an arbitrary text file on a line-by-line basis is absent from it; for contrast, see the sketch after this list. (At least I think it is absent; review the part about the documentation being a hot mess above.)
  • The framework is an Apple-only thing. There is an ongoing effort to open-source the Foundation framework so that Swift programs can be more portable, but it is a work in progress.
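
To make “basic” concrete, here is that very task in Go, whose standard library (for all my complaints about it elsewhere) handles it in a few lines. The file name is a made-up example:

    package main

    import (
        "bufio"
        "fmt"
        "log"
        "os"
    )

    func main() {
        f, err := os.Open("input.txt") // hypothetical file name
        if err != nil {
            log.Fatal(err)
        }
        defer f.Close()

        // bufio.Scanner does buffered, line-by-line reads out of the box.
        scanner := bufio.NewScanner(f)
        for scanner.Scan() {
            fmt.Println(scanner.Text())
        }
        if err := scanner.Err(); err != nil {
            log.Fatal(err)
        }
    }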

The bottom line is that Swift is not, in fact, the general-purpose programming language it is claimed to be. Unless one is writing native-mode GUI applications for Apple products, Swift really doesn’t make much sense.

It’s a shame, as the core Swift language looks to be fairly well-designed. It could be a great general-purpose programming language if only it came with a decent standard library. Alas, that’s a bit like saying Mr. and Mrs. Lincoln could have had an enjoyable evening at Ford’s Theatre if only that hadn’t happened.

I may eventually delve into Swift for such purposes, but as things currently stand, programming in Kotlin with the Java Swing platform allows me to develop GUI tools that run on my Mac, and I don’t have to deal with all the ugliness that is Apple’s native programming environment. Swing isn’t perfect, and its rough edges sometimes manifest, but it’s been good enough for my personal use.

XeTeX Redux

Published at 23:22 on 27 October 2023

It is a known bug. The workaround is to use fontspec to invoke the feature manually, e.g.:

\fontspec{Baskerville}[Renderer=OpenType, RawFeature={+smcp;-liga}]

Ligature substitution should be disabled when using small caps because the two features tend to be incompatible.

Python Set for World Domination?

Published at 18:03 on 22 October 2023

Python is already sitting at the top of the TIOBE Index of most popular programming languages, and has been for some time. And no wonder: it’s one of the best ones out there.

One big thing that stops it from being the best outright is that it has difficulty walking and chewing gum at the same time, i.e. running more than one thread of execution in parallel. In one project, this caused me no small amount of pain. It’s part of the reason I have used the Java virtual machine (usually via Kotlin, which is a more modern language than Java) on some of my projects.

Over the years, there have been numerous proposals to remove the global interpreter lock (GIL) from Python. These have generally gone nowhere; the closest things to exceptions are the Python implementations that target the Java virtual machine and the .NET common language runtime*, which have no GIL to begin with. There are a number of valid reasons for this.

But now, there is a very serious proposal to remove the GIL from the reference implementation, and the Python Steering Council has indicated that it will almost certainly be accepted.

Once this happens, expect Python’s dominance to increase further.

* Alas, these implementations tend to lag (sometimes seriously) behind the reference implementation, plus they are not compatible with many of the third-party Python libraries out there. (The latter issue is also why it has been so difficult to remove the GIL, as doing so in ways that are both a) not ruinously inefficient and b) compatible with existing libraries has proven exceptionally difficult.)