Cutting the (Virtual) Cord

Published at 08:02 on 4 March 2016

I’ve been using Boingo for internet access on the ferry since moving to Bainbridge Island about three years ago.

It’s reasonably priced (about $10 a month), but it’s never been fast. Three years ago, it was easily good enough for reading email with IMAP and sometimes a bit of light web usage. (Pages that had lots of Flash or Javascript never worked well, but simpler ones often worked slow but passably.)

Since then, it’s gotten progressively slower. I long ago gave up using a web browser on the ferry and relegated the connection to IMAP-only use. Then it got harder and harder to do even that; it would take half the trip just to get Boingo to let me connect, and the connection would frequently drop out altogether, making my e-mail client fail.

The exception has been when I tried it on trips other than those at peak commute times. Boingo is surprisingly fast and reliable then (which is mostly irrelevant, since most of my trips are at peak hours). So clearly it’s an issue of capacity and overloading.

One day, while Boingo was giving me the usual frustrations, a new open access point popped up in my Wi-Fi drop-down menu: “Karma Wi-Fi.” It turned out to be a novel marketing strategy by a company that sells cellular internet modems. The connection was faster and more reliable than Boingo had ever been. But the monthly cost was significantly more, so I held off.

Then one evening I did some back-of-the-envelope calculations based on how much my time is worth, and it was clear that the more expensive service would pay for itself in about a fortnight. So I’ve taken the plunge, and it’s been as reliable as my free trial was. Time to call Boingo and cancel my service.

Time for Another “Javascript Sucks” Post

Published at 15:36 on 12 November 2015

For no other reason than it’s been a while since my last one. Oh, I’ve just been bitten by some sucky Javascript today, but that’s hardly new: it happens most days.

That’s because there’s an awful lot of sucky Javascript code out there. Which is the case because Javascript is a difficult language to program well in. Which in turn is the case because both the core language and its object model basically suck.

I mean, how much more sucky a design decision can there be than making all variables global by default unless explicitly made local? This is precisely the opposite of what any sane design would do. Thanks to this misfeature, all the Javascript coder needs to do is absentmindedly leave off a var or two and presto, there’s a bug waiting to strike. And Javascript’s asynchronous, callback-driven nature means the bug probably won’t be immediately obvious, so it will make it through testing and into production where it can bite users.
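Here’s a minimal illustration of the trap (the function and variable names are made up for this example, not taken from any real codebase):

    function countFailures(results) {
        failures = 0;                    // "var" forgotten: this silently creates
                                         // (or clobbers) a global named failures
        for (var i = 0; i < results.length; i++) {
            if (!results[i].ok) {
                failures += 1;
            }
        }
        return failures;
    }

Any other function that makes the same slip shares that same global, and with callbacks interleaving at unpredictable times, the two can clobber each other in ways that only show up intermittently.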

And then there’s the Javascript object model. It’s not necessarily deficient; it’s just bizarre. Well, bizarre to anyone used to the classical inheritance model (which means virtually every other object-oriented language out there); Javascript uses prototypal inheritance. It’s a bit of gratuitous difference that just makes Javascript needlessly strange and thus more difficult to learn and understand for the vast majority of programmers. Why? Just why?
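For the unfamiliar, here’s roughly what the prototypal model looks like (a toy example of my own, using the ES5 Object.create call): there are no class declarations at all, just live objects chained to other live objects.

    // One ordinary object serves as the prototype...
    var animal = {
        describe: function () {
            return this.name + " says " + this.sound;
        }
    };

    // ...and new objects are created with it as their prototype.
    var dog = Object.create(animal);
    dog.name = "Rex";
    dog.sound = "woof";

    console.log(dog.describe());                         // "Rex says woof"
    console.log(Object.getPrototypeOf(dog) === animal);  // true: no class in sight

Perfectly workable once you wrap your head around it, but it’s yet one more thing standing between the average programmer and correct code.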

That Javascript makes failure easy and success difficult can be illustrated by how it’s not that hard to run into bad (i.e. flaky and unreliable) Javascript on Google pages. I mean, if one of the largest and wealthiest corporations on the planet, one famous for hiring the best and brightest, one that writes browsers as well as web pages, can’t successfully implement a toolkit to tame Javascript’s bias towards failure outcomes, that’s about as damning an indictment as one can make against Javascript.

Finally, it’s instructive that those who designed the Google Web Toolkit chose to (alas, ineffectively) fight Javascript’s brokenness by writing a tool to enable developers to avoid programming in Javascript entirely.

Javascript code would be bad enough if most of it only had all of the above factors working against it, but wait, there’s more.

First, client-side Javascript has to be cross-browser portable, and there are lots of gratuitous little differences between the execution environments on different browsers (or even differing versions of the same browser).

Second, many kinds of client-side code, in particular AJAX code, are difficult to write well. Such code must cope well with all sorts of network conditions without adversely impacting human-machine interactivity.

The latter would be difficult to do even with a sanely-designed programming language and execution environment. In Javascript, it’s often close to impossible.
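To give a rough idea of what “coping well” means in practice, here’s a minimal sketch of the bookkeeping a single request needs just to fail gracefully; the endpoint name and retry policy are invented for illustration:

    function fetchStatus(retriesLeft, onSuccess, onGiveUp) {
        var xhr = new XMLHttpRequest();
        xhr.open("GET", "/api/status", true);   // hypothetical endpoint
        xhr.timeout = 5000;                     // don't let a dead network hang the UI

        function retryOrGiveUp() {
            if (retriesLeft > 0) {
                fetchStatus(retriesLeft - 1, onSuccess, onGiveUp);
            } else {
                onGiveUp();                     // fall back to the static content
            }
        }

        xhr.onload = function () {
            if (xhr.status === 200) {
                onSuccess(xhr.responseText);
            } else {
                retryOrGiveUp();
            }
        };
        xhr.ontimeout = retryOrGiveUp;
        xhr.onerror = retryOrGiveUp;
        xhr.send();
    }

And that’s the easy part; doing it without degrading interactivity, across every browser’s quirks, is where things really fall apart.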

It’s one reason why the answer to the question of how best to do something in client-side Javascript is a simple don’t.

Don’t do it. Not if you can do it in static HTML somehow. It might in some theoretical sense be better to have more interactivity, but in practice a less theoretically elegant solution that works, and works reliably in a wide variety of situations, will come out ahead.

Re-Thinking Software-Defined Radio

Published at 10:04 on 14 October 2015

After blowing two evenings trying to get SDR working, I’m beginning to think I was correct in basically writing the technology off as not worth the trouble some years ago. I fight with computers in my day job. I don’t want to do it as a hobby.

First, I use Macs. If you use a Mac, you’re really left out. The vast majority of SDR software supports Windows and Windows only. The few exceptions tend to run on Linux and not Macs.

Sure, I could boot Linux on my Mac, but it’s Linux. That means it was written by hard-core geeks for hard-core geeks, so documentation is incomplete (if available at all). To prove my point, I tried to create a bootable Linux flash drive last night, following all the instructions meticulously. It didn’t work; it failed to even appear as a boot device when the system came up. That means there’s probably some step missing from the by-geeks, for-geeks instructions, left out because it’s transparently obvious… obvious to a hard-core Linux geek, that is. Figuring out the answer to that puzzle could easily eat my free time for the next several weeks. No thanks. I want to geek around with radio, not Linux systems administrivia.

The few exceptions, i.e. SDR programs that run natively on the Mac, tend to involve MacPorts. Which is currently broken (link). Sigh.

That leaves running Windows, which probably means buying and setting up a whole new computer. If it comes to that, there goes any cost advantage of SDR; even a sub-$10 dongle like the one working its way to me from Singapore will have a total cost about twice that of the Alinco receiver I just purchased. It might actually come to running Windows… eventually. Right now, there are higher priorities for spending that sort of cash.

The Smart Phone Era Will End… Eventually

Published at 08:14 on 7 October 2015

Why? Several reasons.

Just because something can be done does not mean it should be done. This reason is currently lying dormant, as ours is a technology-fetishizing society and we’re still in the stage of being wowed and dazzled by the fact that smart phones are even possible.

Just because something can be done does not mean it is therefore fashionable and popular. Another one that is currently lying dormant due to technology fetishism, and probably a much more relevant one than the above. Eventually, the fashionable will decide not to carry smart phones. People like movie stars and politicians in high office don’t need them; they have assistants to handle such duties. Jettisoning the phone will be a fashion statement that they are powerful and affluent enough to have such assistants.

This will be much like how having a suntan went from being the mark of a common farmer to being the mark of someone privileged enough to have lots of leisure time outside of factories and offices. Even if those without personal assistants still have to carry a phone with them, they will opt for phones that are as small and unobtrusive as possible.

When will this happen? Who knows. It could take another ten or twenty years. I don’t think it will take significantly longer than twenty. That’s a generation, which is long enough for a new generation to see smart phones and obsession over them as yet another dorky adult thing. At that point, the way will be paved for the newest, most fashionable entertainment figures to establish not carrying much personal technology as a fashion statement.

Does my personal bias play any part in my forecasting this? Almost certainly. Yet while I personally want the smart phone era to end, that doesn’t change how the above factors all exist and lie waiting ready to manifest themselves. And personally, I’d want the new trend to happen faster than ten or twenty years, yet I’m not forecasting it will begin soon. So it can’t be written off as purely personal bias.

A GPS for My Truck? No Thanks!

Published at 10:04 on 15 September 2015

I actually had a chance to try one out, for free. I’m not impressed.

First, it’s far more distracting than a map. The display animates to show your progress. This grabs my peripheral vision and distracts me from what’s going on outside. That’s more than just annoying: it’s unsafe. There’s no escape from this drawback: the thing has to be mounted on the dash in order to “see” the satellites. If I put it on the seat so it doesn’t distract me, then I have to wait several minutes for it to locate itself every time I check my position. By contrast, an old-fashioned paper map stays out of the way when I don’t need it yet is there, instantly, whenever I wish to consult it.

Second, it shows only a tiny part of any map at once. It’s very difficult to get an overall idea of the layout of where I am: zooming out loses detail, and panning loses context. A big, old-fashioned map is much better in this regard.

Third, it’s expensive. I just bought a comprehensive street map of Kitsap County for $6. So far as addresses go, the phone company sends me a countywide phone book every year for free. Since I don’t need it at home (where I can use the Internet), I put it in my truck. Any decent GPS will cost about 15 times that much. Plus in a few years, the maps and address data inside the GPS will need updating. That costs $50 or $60, i.e. fully 10 times what acquiring a new map and phone book does.

Fourth, it’s limited. It shows but a subset of businesses and business categories. Compared to the phone book, it sucks. It also shows a very limited subset of points of interest like parks, lakes, etc. Compared to the index on my old-fashioned map, it sucks.

If I did more long-distance road trips, I could see such a thing having some utility despite its drawbacks, because it’s impractical to acquire in advance, and carry with you, a detailed map of every last town you pass through. But I don’t — so it doesn’t.

Software Quality (or Lack Thereof)

Published at 16:22 on 8 July 2015

For my paid work, I maintain a program which runs for a long time (essentially, indefinitely) making millions of socket calls per day and doing extensive amounts of text parsing (it’s a web crawler).

What impresses me is how often problems in my code are not really problems in my code: they’re problems in some library that my code calls. One time it was even a problem in the system libraries: socket I/O would work fine for a day or two, then somewhere between two days and a week in, socket calls would simply and mysteriously hang. Another repeated source of headaches was the LXML library, which caused me all sorts of issues with memory leaks, infinite loops, and runaway recursion.

This is in the open source (Linux) world, and it underscores a general lack of thorough testing. I consider it unacceptable that a program which makes about 2 million socket calls per day will fail, due to a library bug, after about 10 million calls on average. One should be able to make an indefinite number of system calls (absent system quotas and limits, of course).

But apparently I’m somewhat unusual in having high standards like that. LXML has a (totally undeserved, in my opinion) reputation for robustness, and that faulty system library made it into a major CentOS release.

Or maybe I’m being unreasonable in expecting that a program which runs for an hour without running into issues should run for a day, a week, or a month without being cut down by memory leaks in the code it calls. (I assume it was a slow memory or other resource leak in the socket-call case; it presented the classic symptoms of one.)

Synergy: Beyond Awful

Published at 11:20 on 17 June 2015

Well, scratch what I just said earlier about Synergy being a viable stopgap solution. It’s not even that.

It’s simply beyond awful. Not only is it slow and laggy, but keystrokes and mouse clicks randomly vanish and fail to get delivered at all. Programs start acting in bizarre and unpredictable ways: windows fail to show up when the keyboard shortcut that should make them appear is typed, cursors mysteriously vanish and reappear, the normal behavior of the Finder when windows are clicked on becomes erratic, and so on.

I’m better off manually switching cables than subjecting myself to the user interface horrors of Synergy.

KVM Switches Are Not Obsolete

Published at 10:19 on 17 June 2015

Don’t let the techno-cheerleaders for products like Synergy fool you. KVM switches are still very much relevant.

For openers, they let you switch a monitor as well as a keyboard and a mouse. That’s a big plus for me. One of my computers is a laptop with a limited amount of on-screen real estate. It’s a huge plus to be able to add my desktop’s screen to it.

Second, there’s the hidden catch of network-based keyboard and mouse sharing: lag. Even though slight, it’s quite noticeable, and very annoying. The keyboard lag in particular has an adverse impact on my typing speed.

So it looks like Synergy’s place is as a stopgap solution until the replacement for my now-dead KVM switch arrives.

Oh, and if you’re interested in downloading Synergy, it pays to go to the link above and not the one that shows up at the top of web searches. That latter site tries to zing you for the privilege of downloading a version that’s more recent than a year old (and such versions are clunky and difficult to configure, at least on a Mac). You’re much better off using the recommended version from the site I linked.

Building Gnuplot on Mac OSX

Published at 09:22 on 20 May 2015

There’s not much out there on how to do it, and what’s there is either flat-out incorrect (fails to produce a binary that’s actually useful for anything, because it can’t directly display data on the Mac) or needlessly painful (involves things like MacPorts, which end up bloating your computer by building most of the open source Linux universe first).

  1. Download and install the latest version of AquaTerm, available here.
  2. Download the most recent production version of the source for Gnuplot, available here.
  3. Type the following commands to build and install Gnuplot:
    ./configure --with-readline=builtin --with-aquaterm
    make
    sudo make install
    

Note the two options passed to configure. The readline library that ships with many MacOS releases is buggy and makes things crash (hence telling Gnuplot to use its built-in readline instead), and the configure script is too stupid to automatically realize that AquaTerm is present. The hard part about building Gnuplot on the Mac is building a binary that’s actually useful; left to its defaults, the configure script will merrily create a configuration that either crashes on loading or cannot actually display anything directly on the Mac screen.

At least, this worked for me. I’d be interested in hearing whether or not it works for you.