Thinking about Privacy Policies

I am in the process of developing and publishing an Android app to the Google Play store. Part of the process of doing so is developing and publishing a privacy policy.

Initially, I thought this would be super-simple: Don’t collect information, then there is nothing to share or to establish policies about sharing. Simple. However, in the real world, things are seldom so simple as they might at first appear.

The first complication came when I realized that although my app does not (and probably never will) gather and pass on usage statistics, the places from which users might download it certainly will. Those places include a web site run by yours truly in addition to the Google Play store.

Virtually every web server on the Internet logs each and every request it receives, and these log messages typically contain, at a bare minimum:

  • The time the request arrived,
  • The IP address the request arrived from,
  • The URL of the resource being requested, and
  • Basic information on the user agent (i.e. web browser) used to make the request, typically including the operating system the user agent was running under.
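For concreteness, a single request in a typical access log looks something like this (a made-up example in Apache’s “combined” log format; the IP address, timestamp, and user-agent string are all invented):

```
203.0.113.42 - - [12/Mar/2021:14:07:33 -0500] "GET /downloads/myapp.apk HTTP/1.1" 200 4194304 "-" "Mozilla/5.0 (Linux; Android 11; SM-G991B)"
```

The time, the client IP address, the requested URL, and the user-agent string are all right there in that one line.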

So, say you are an AT&T customer in Brooklyn who uses your Samsung Galaxy S21 to download a copy of my app. I (or Google) will be able to tell from your IP address that you are an AT&T customer in the New York City metro area. We may even be able to tell that you were in the borough of Brooklyn, and that you were using a Galaxy S21. If we share your IP address with AT&T Wireless, they will definitely be able to determine exactly who you are, what hardware you used, and where you used it, and (if you were doing something unlawful and/or abusive) to take action against you for what you did.

Some Internet users are shocked to discover this. If you are one of those, consider yourself educated.

Why is this done? Not always for nefarious purposes! In fact, not usually for such. Gathering such data can be extremely useful for dealing with things like abusive users (they exist), troubleshooting software and network problems (they are inevitable), or managing the growth of traffic to a web site or to a cellular network.

But it’s still pretty simple, right? So I am collecting basic usage statistics (and Google Play will doubtless collect some on my behalf that it can share with me in reports). Just do not share the information!

Well, there is the matter that I could end up in jail on a contempt of court charge for adhering to such a policy: what if a law enforcement officer or a process server arrives at my door armed with a warrant or a subpoena?

Okay, then, exclude that and nothing else. Solved!

Not so fast, yet again! What if my app becomes popular with violent white nationalists and neofascists? I am, after all, promising to gather a fairly minimal amount of information and to be as reluctant as possible about sharing it; that makes my app attractive to such individuals.

It also makes it attractive to those breaking laws to undermine oppression and to advocate for more freedom, which is my main intent. If that sounds reckless to you, just ponder that any oppressive order has always considered it a crime to undermine said order; revolutionary politics is intrinsically criminal politics. Lech Wałęsa was a criminal; Martin Luther King was a criminal; Mahatma Gandhi was a criminal. If the Founding Fathers of the United States had failed in their endeavor, they would have been prosecuted and for the most part executed for the crime of treason against the British Empire.

The only exceptions to the above rule are certain situations when the revolutionaries are judged to be sufficiently tiny in number and powerless so as to pose little or no threat to the established order. And as soon as they gain enough power to cease being so, watch out! The velvet gloves will be replaced by an iron fist.

But I digress. So now I must craft an exception for things like neofascist and white nationalist politics. While I do not want to, and do not have any intent to, regularly monitor the download logs, I want to be free to cooperate with antifascist organizations should my cooperation prove helpful to the cause of fighting fascism.

That, of course, raises the question of just what, precisely, “neofascist and white nationalist politics” is. However I define it, it opens up the prospect of all sorts of word games: “No, I am not a ‘fascist,’ you stupid leftist. I am a ‘nationalist’ and an ‘identitarian.’”

Now I am stuck trying to anticipate those word games, all the while keeping a privacy promise that is still meaningful to the vast majority of people, including people with whom I might politically disagree but who are nonetheless not fascists and whose beliefs must be accepted as part of the diverse spectrum of beliefs in any free and open society.

In the real world, things are seldom so simple as they might at first appear.

Testing Android Apps

It leaves a lot to be desired.

The normal unit testing is advertised as supporting most of the Android class library (which is not the same as the standard Java class library), but what they don’t tell you is that it’s chock-full of stubbed-out dummy logic. The routine to load an image from a file, for example, always returns a 100 by 100 black image. That’s sort of a deal-killer if one is trying to test image-processing code.

The instrumented testing runs on Android devices, so it avoids those headaches, but it too is extremely limited in scope and needlessly developer-hostile. For example, the test code is by default strictly disallowed from making any modifications to the filesystem. If one is testing an app that processes files, that again ends up being a deal-killer (how, exactly, am I to create the test files to feed to the app being tested?).

There are ways to disable this misfeature, but they are very poorly documented. It’s a setting buried deep in an obscure settings menu somewhere. Where, exactly, is not standardized: it varies so much from device to device that a single set of instructions is not even valid across one Android OS release. I gave up in disgust after pissing away at least an hour searching in vain for it on my phone.

If Google wants developers to write good, comprehensive tests for apps, they need to stop making it as difficult as possible for us to do so. Until then, Google can take its pleading about writing tests and go fuck themselves. I will still write tests, but not very comprehensive ones.

Crimping versus Soldering

The world is full of analyses like this one that confidently proclaim crimping to be better than soldering. The real world is not nearly so simple.

Yes, a properly executed crimp connection with a quality crimp connector is by all measures superior. The devil is in those weasel words.

Given that it is possible for a crimped connection to be superior to a soldered one, and given that crimping is faster than soldering, why would anyone solder? Soldering when connections can be crimped seems obsolete.

That is how many retail hardware stores promote crimping, often in a big blister pack with cheap crimp connectors and a cheap crimping tool like this one. Well, good luck with that. It takes a skilled craftsman to execute a quality crimp with a cheapo tool and cheapo connectors. It is, in fact, easier to learn to solder.

An anecdote to close: When I worked in IT support, the department purchased a cheap crimping tool that could crimp both 6- and 8-position modular connectors, and some bulk cable. No longer would custom lengths of cable need to be special-ordered.

Those crimps were responsible for trouble ticket after trouble ticket. When I broke the crimpers trying to exert enough force for a quality crimp, I put my foot down and insisted they spend over $100 on a name-brand, quality crimping tool and set of crimping dies. It was money well spent, because the number of trouble tickets dropped to zero on connectors crimped with it.

It’s not that bad with standard wire crimp connectors; $25 or so can get you a good, compound-action, ratchet-based crimping tool. Even then, it’s good to budget some time for practicing and for learning how to recognize a bad crimp. But again, that’s not how crimping is sold. Most of those crimp kits don’t even cost $25 total, and no mention is made of skill development.

Personally, I solder. I already have a soldering iron and know how to use it as a result of messing with electronics for many years, and I don’t splice wires often enough to justify the expense of a crimping tool, the clutter-management headaches of maintaining a stock of crimp connectors, and so on.

What Sucks Most about GUIs: No Current Directory

Say you’re doing some coding and debugging using the command line. You change directories to where the source code is and thenceforth, until you explicitly change to some other directory, you have set the default directory for all editors, compilers, debuggers, file-filtering utilities, and so on. It’s automatically understood that, unless you explicitly specify otherwise, if you specify a file name, it will be in the current working directory.
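The same convention is baked into the programming languages themselves. In Java, for instance, a bare file name resolves against the working directory the program was launched from (a minimal sketch; the file name notes.txt is hypothetical):

```java
import java.nio.file.Path;
import java.nio.file.Paths;

public class CwdDemo {
    public static void main(String[] args) {
        // A relative path with no directory component resolves against the
        // current working directory, which the "user.dir" property reports.
        String cwd = System.getProperty("user.dir");
        Path p = Paths.get("notes.txt").toAbsolutePath();
        System.out.println(cwd);
        System.out.println(p);  // cwd + separator + "notes.txt"
    }
}
```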

No graphical user interface that I’ve used has had anything analogous. As such, I’m forced to continually spend effort manually specifying source and destination directories. This constitutes repeated work that I don’t have to do when using the command line.

Each utility starts out relative to the home directory, or, in some cases, the directory where it was last used. It’s hard to say which is worse, but I think my vote goes to the latter: it’s a pathetic attempt at user-friendliness that ends up being incredibly user-hostile: in order to predict where the files for a given application will go, I must memorize, for each application, where the last file I specified with it lived.

It all results in continual violations of the principle of least surprise. I’m persistently having to use find to determine which seemingly random and bizarre place a GUI application just created an output file in.

This, more than anything else, is why, decades into the existence of the graphical user interface, I inevitably have one or more terminal windows open, and opt to do a significant amount of my work at the command line.

Scaling Images in Java and Kotlin

It’s sort of confusing. There’s a lot of ways to do it. And by “a lot” I mean a lot.

Despite the range of options, my initial attempts at downsizing images were disappointing, yielding ugly results. As an example, consider this:

To fully appreciate how bad that is, here’s what Gimp produces when asked to do the same thing:

Yes, the same thing. In both cases, I am requesting the image be scaled with the bicubic algorithm. Clearly, there is more processing going on in Gimp than simply bicubic scaling. Despite my attempts, I kept getting this sort of disappointing result as I experimented with the various ways of doing it.
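For reference, the plain-bicubic approach that kept producing those poor results amounts to a single interpolated draw. A minimal Java sketch (the class and method names are mine, not from any library):

```java
import java.awt.Graphics2D;
import java.awt.RenderingHints;
import java.awt.image.BufferedImage;

public class BicubicScale {
    // Scale an image with a single bicubic-interpolated draw. This is the
    // "plain bicubic" approach that yields the disappointing results.
    static BufferedImage scaleBicubic(BufferedImage in, double ratio) {
        int w = (int) (in.getWidth() * ratio);
        int h = (int) (in.getHeight() * ratio);
        BufferedImage out = new BufferedImage(w, h, BufferedImage.TYPE_INT_ARGB);
        Graphics2D g = out.createGraphics();
        g.setRenderingHint(RenderingHints.KEY_INTERPOLATION,
                RenderingHints.VALUE_INTERPOLATION_BICUBIC);
        g.drawImage(in, 0, 0, w, h, null);
        g.dispose();
        return out;
    }
}
```

A single bicubic resample at a large reduction ratio samples the source too sparsely, which produces the aliasing artifacts visible above; Gimp evidently applies extra filtering beyond plain bicubic interpolation.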

Finally, I ran across a magic combination of API calls that yields acceptable results:

val nWidth = (imageIn.width * ratio).toInt()
val nHeight = (imageIn.height * ratio).toInt()
// imageIn.type can be TYPE_CUSTOM (0), which the BufferedImage constructor
// rejects, so fall back to a known pixel format in that case.
val type = if (imageIn.type == BufferedImage.TYPE_CUSTOM) BufferedImage.TYPE_INT_ARGB else imageIn.type
val imageOut = BufferedImage(nWidth, nHeight, type)
imageOut.createGraphics().run {
    drawImage(imageIn.getScaledInstance(nWidth, nHeight, Image.SCALE_SMOOTH), 0, 0, null)
    dispose()
}

That code produced the following:

Not quite as good as what Gimp yields, but close. The takeaway is that getScaledInstance does at least some necessary extra processing, above and beyond the standard scaling, which is needed to produce acceptable results. I probably wouldn’t want to use it for high-resolution production work, but for generating screen-resolution images for sharing on the Web it’s perfectly adequate.

Update: Some more research reveals that getScaledInstance with SCALE_SMOOTH doesn’t use the bicubic algorithm after all; it uses an area-averaging algorithm, which is even slower. Those details are mostly irrelevant, however: the important thing is that for my application it delivers acceptable quality in an acceptable amount of time, which is more than can be said for the alternatives.

Linux: Still Linux (Alas)

Mind you, I’d really like it if I could wholeheartedly endorse Linux as an alternative to Windows or MacOS for a general-purpose desktop operating system. But I just can’t.

Linux is great for some things. Servers, for instance. I run a Linux server at a colocation site for a variety of purposes. It was basically a no-brainer: it’s a rock-solid server OS. Linux on the desktop has improved to the point that for basic use (e.g. browsing the Web, reading email, maybe typing a document or two, or downloading and editing digital photos) it is now a totally viable alternative to Macs or Windows.

The problems happen when one moves beyond basic desktop use: one all-too-quickly ends up in a maze of twisty little passages of UNIX system administration arcana. Hardware support, in particular, seems to be a bane of Linux. I couldn’t even get one of the most common digital radio interfaces running with one of the most common ham radio applications on one of the most common desktop Linux distros!

Yes, yes: there’s distros expressly designed for ham radio. Well, what if I want to use that computer for more than just ham radio? I’m S-O-L, that’s what: instead of delving into system arcana trying to get ham software working, I’ll doubtless be delving into system arcana trying to get normal desktop productivity software running.

In fact, the very existence of such ham radio-specific distros puts the lie to the claim that Linux interoperates well with ham radio hardware. If Linux did interoperate well, it wouldn’t be necessary to create such specialized distros in the first place! (Why create a specialized distro, if all one needs to do is install a few packages and make a few quick, easy tweaks to a mainstream distro?)

Then there’s my experience with the Raspberry Pi. Not having an HDMI monitor, and not wanting to clutter up my limited space with one, I opted to order a serial interface cable with my Pi. It worked: the Pi booted and used the serial console when I connected it. Until they “upgraded” the Raspbian distro to remove that feature, that is, and failed to properly document how to re-enable it. After pissing away half a week trying to get the thing to boot on the serial console, I gave up.

Forget it. I retired from systems administration because I was sick of it. Doing systems administration for “fun” as a “hobby” holds precisely zero appeal for me. If it doesn’t work with a modicum of effort on my part, I’m simply not interested. Ham radio is the hobby. Linux systems administration is not.

Linux has definitely gotten better as a desktop system over the years, but it’s still not fully there. Sorry, fanboys.

I Think I’m Starting to See a Pattern Here…

I’m trying to package a Java program I wrote so it makes a nice, professional-looking “clickable” app, complete with a custom icon.

First up was the Mac. The Oracle-furnished packaging tools were buggy and did not exactly work as documented, but I finally managed to make a (crappy) package from them.

Then came Linux. At first I was at a loss as to what to do, then I decided to crib the package bundler that the jEdit build files used. It was a huge struggle, because it was your typical open-source project, almost completely undocumented. Eventually I managed to get it to limp along to completion and make a (nonworking) Debian package.

A day of struggle followed, trying to make the nonworking package work. Eventually I gave up on the bundler and decided to make a Debian package completely from scratch. That was surprisingly easy compared to the crap software I had been fighting with.

Then back to the Mac. Would the bundler that the jEdit team used do any better a job than the stock one shipped with the JDK? No, it would not. So I looked into what made a Mac application bundle tick, and it wasn’t that complex. The biggest hurdles were (a) finding the magic keyword to search on (“bundle” in this case), and (b) creating an Info.plist file (doable once I located the documentation for them).

So I built that one totally from scratch, too. So now I’m two for two at it being less work to “re-invent the wheel” than it is to use an existing, off-the-shelf solution.

Next up: Windows. Just for yucks, I’ll give Launch4j a whirl, though based on my recent experiences, I don’t expect it to work, and I’m not planning on investing much time in trying to make it work, either. Who knows, maybe I’ll get pleasantly surprised. (Then again, probably not.)

Update: Well, I’ll be. Launch4j actually proved to be a time-saver. The most obnoxious thing about it is a bizarre insistence on four-part version numbers, but it turns out that’s a Windows thing (and it is documented), so it’s not the Launch4j team’s fault.

Ubuntu LTS 18 “Bionic Beaver” Font List

There’s no shortage of resources out there listing the standard fonts on various versions of Windows and MacOS, but references for the same information about Linux seem to be very scarce. For the record, here’s what fonts are present on a freshly installed Ubuntu LTS 18 system (a “typical” install, which includes Libre Office):

aakar
Abyssinica SIL
Ani
AnjaliOldLipi
Bitstream Charter
Century Schoolbook L
Chandas
Chilanka
Courier 10 Pitch
DejaVu Math TeX Gyre
DejaVu Sans
DejaVu Sans Condensed
DejaVu Sans Light
DejaVu Sans Mono
DejaVu Serif
DejaVu Serif Condensed
Dialog
DialogInput
Dingbats
Droid Sans Fallback
Dyuthi
FreeMono
FreeSans
FreeSerif
Gargi
Garuda
Gubbi
Jamrul
KacstArt
KacstBook
KacstDecorative
KacstDigital
KacstFarsi
KacstLetter
KacstNaskh
KacstOffice
KacstOne
KacstPen
KacstPoster
KacstQurn
KacstScreen
KacstTitle
KacstTitleL
Kalapi
Kalimati
Karumbi
Keraleeyam
Khmer OS
Khmer OS System
Kinnari
Laksaman
Liberation Mono
Liberation Sans
Liberation Sans Narrow
Liberation Serif
Likhan
LKLUG
Lohit Assamese
Lohit Bengali
Lohit Devanagari
Lohit Gujarati
Lohit Gurmukhi
Lohit Kannada
Lohit Malayalam
Lohit Odia
Lohit Tamil
Lohit Tamil Classical
Lohit Telugu
Loma
Manjari Bold
Manjari Regular
Manjari Thin
Meera
Mitra Mono
Monospaced
mry_KacstQurn
Mukti Narrow
Nakula
Navilu
Nimbus Mono L
Nimbus Roman No9 L
Nimbus Sans L
Norasi
Noto Color Emoji
Noto Mono
Noto Sans CJK HK
Noto Sans CJK JP
Noto Sans CJK KR
Noto Sans CJK SC
Noto Sans CJK TC
Noto Sans Mono CJK HK
Noto Sans Mono CJK JP
Noto Sans Mono CJK KR
Noto Sans Mono CJK SC
Noto Sans Mono CJK TC
Noto Serif CJK JP
Noto Serif CJK KR
Noto Serif CJK SC
Noto Serif CJK TC
OpenSymbol
Padauk
Padauk Book
padmaa
padmaa-Bold.1.1
Pagul
Phetsarath OT
Pothana2000
Purisa
Rachana
RaghuMalayalam
Rekha
Saab
Sahadeva
Samanata
Samyak Devanagari
Samyak Gujarati
Samyak Malayalam
Samyak Tamil
SansSerif
Sarai
Sawasdee
Serif
Standard Symbols L
Suruma
Tibetan Machine Uni
Tlwg Typist
Tlwg Typo
TlwgMono
TlwgTypewriter
Ubuntu
Ubuntu Condensed
Ubuntu Light
Ubuntu Mono
Umpush
Uroob
URW Bookman L
URW Chancery L
URW Gothic L
URW Palladio L
utkal
Vemana2000
Waree

Stop OSX Catalina From Shifting the Display

Keywords: OSX Catalina, Macintosh, hide menu bar, display, screen, shift, feature, disable.

TLDR: It’s an accessibility feature called Zoom. Look in System Preferences… Accessibility… Zoom and disable any gestures or keyboard shortcuts pertaining to Zoom.

As soon as I upgraded my newer Mac to Catalina, it started happening: whenever the mouse cursor got close to the top or the bottom of the screen, the display would shift slightly, by 20 or 30 pixels or so.

It lent an overall air of sloppiness to the whole user experience, yet it was obviously an intentional (mis)feature of some sort, because implementing it is non-trivial in code (it requires moving a lot of data around in video memory). There simply was no conceivable way this could happen as the result of a common coding bug. Finally, it had never happened to me before I upgraded to Catalina, and now it always happened, but only on the newer Mac that ran Catalina. The old Mac (which cannot be upgraded, since it is no longer a supported product) simply never developed this behavior.

So I started looking through the system preferences for the obnoxious new feature. It wasn’t in the “General” or “Desktop & Screen Saver” sections, and I couldn’t see any other obvious place where it might be; nothing else obviously controlled a display issue like this.

The next step was attempting to find an answer via a search engine, but I also kept coming up dry. I gave up, having pissed away well over an hour on the issue by that time, and decided to try living with the misfeature.

But it was annoying, extremely annoying. I like to keep track of the time by looking at the digital clock on the right-hand side of the menu bar, yet the misfeature meant that about half of the menu bar was not visible, which typically made the clock illegible. I could address this by moving the mouse cursor up to the top of the screen, but it’s annoying to have to do that. I shouldn’t have to mess with my pointing device just to see the time of day.

So, I kept revisiting the issue, hoping to hit on the magic keyword that would eventually turn up the solution. Nothing ever worked.

Eventually, I broke down and posted something to Reddit, making sure to be irate and whiny (past experience has shown that an irate tone is more likely to generate responses for such questions).

Sure enough, it was a deliberate feature, one related to an accessibility (for the disabled) feature called, of all things, “zoom,” which is why I had been unable to locate it, or even find out about it via a search. I would have never guessed that shifting the screen like that had anything to do with zooming or magnifying the screen.

So many modern user interface design techniques come across as completely bizarre and counterintuitive to me. I don’t think OSX would even be a usable GUI to me, were it not for how I’ve disabled feature after feature in it in the settings over the years.

Why Swing? Why not JavaFX?

I recently decided to finally take a serious crack at developing a GUI application.

My first choice was to make it a native Mac application, and write it in Swift. I soon was reminded, by fresh personal experience, of what I had discovered the last time I tried to code a native Mac application: that Macs are approximately as programmer-hostile as they are user-friendly. Documentation was patchy and incomplete. Interfaces were bizarre and counter-intuitive. Worst of all, things change radically from release to release of the OS, to the point that most of the documentation out there is basically useless, because it is for MacOS releases prior to Catalina.

I could have persevered, but it was clear that MacOS app development is a dark art that takes a lengthy and painful initiation process to cultivate. No thanks; I just want to get my app coded and finished. Would have been nice to have a native app that dovetailed as nicely as possible with the rest of the system, but being able to finish it in a timely manner takes priority.

I had in the past year run across Kotlin, which struck me as a well-designed effort to modernize Java (or, alternately, Scala done right: a more modern language for the JVM that avoids the pitfall of creeping featurism that made Scala excessively complex). Java has long supported portable GUI application development, and since Kotlin is a JVM language, you can easily call any of the Java libraries from Kotlin. So Kotlin it was.

It didn’t take that much research to determine that the new and supposedly preferred way to code graphical user interfaces in Java was JavaFX. So I went to the JavaFX web site, downloaded JavaFX, and typed in the “Hello, World” example listed on that site.

It didn’t work. After double- and then triple-checking what I had done, I could see zero discrepancies between what I was doing and what the tutorial was telling me to do.

Noticing that the current release of JavaFX hadn’t been out all that long, I tried regressing to the previous version now on long-term support status. The code still didn’t work.

I tried posting a query on Reddit as to what I was doing wrong. Nobody had any idea.

I reported the bug via the project’s GitHub page. That prompted a curt, incomprehensible, acronym-laden response to file the bug report some other way (and my bug report on GitHub was perfunctorily closed).

Eventually, I got the example code to produce the output it should, by regressing to the version of JavaFX that was distributed with JDK 8. Then I started investigating how I’d code my program in JavaFX.

It quickly became apparent that one of the things I needed to do would probably involve a lot of work in JavaFX, but that there was a Swing UI component that did basically all I wanted, and that it was easy enough to embed Swing components in a JavaFX application. While doing that, I ran across yet another bug in JavaFX.

But why? I had already established that JavaFX is full of bugs, insufficiently documented, and has a development team whose attitude about quality is lacking. Now I learn I can’t code it all the “new way” even if I want to, because JavaFX is lacking in basic features as well.

Moreover, it didn’t take much research to uncover that Oracle (the single most important player in the Java development process) makes massive internal use of Swing, and has no plans to remove support for Swing anytime soon. In other words, Swing is definitely here to stay; rumors of its deprecation have been greatly exaggerated.

So that’s what I used. Maybe, in a few years, if I have occasion to write another graphical user interface, I will investigate whether JavaFX is any closer to being ready for prime time than it currently is.

I can tell that some of what JavaFX is trying to accomplish would be a real improvement. It’s a pity that the current state of that project is evidently so lacking in features and quality control, but there you have it.