On Specializing AKA Databases Are Easy Now

After hanging out my shingle in the independent consulting business (trying to pay the bills while exploring entrepreneurial efforts[1]), one of my original specialties was high-performance database systems.

I pitched my expertise primarily in large-scale, high-demand database systems, providing solutions that would dramatically improve operational performance and relieve pain points. Owing to my network of contacts, this work was usually in the financial industry.

It made for relatively easy work: many groups don't understand how to properly leverage their database products, or how to match them to their data needs and query profile, so I could often promise at least a 50x performance improvement on some particular pain point, on existing hardware, and deliver that and more with ease.

I started backing away from the database world, however, because many database products have dramatically improved, making it harder to shoot yourself in the foot, and, more significantly, because hardware has improved exponentially.

From massive memory that caches the entirety of all but the most enormous databases, to flash storage (and now Optane storage) that makes even the most inefficient access patterns effectively instant: many shops have gone from a big RAID array that could barely manage 1,000 IOPS, on a platform with maybe 32GB of RAM, to monster servers with millions of IOPS and sometimes terabytes of very high-speed memory.

It has gotten hard to build a poorly performing database system, at least at the scale that many firms operate at. Expert implementations can still yield significant speedups, but to many potential customers a 100ms query is effectively identical to a 1ms query, with load low enough that hardware saturation isn't a concern (or hardware so inexpensive that a cluster of ten huge machines is fine, despite being significant overkill relative to a proper implementation). The most trivial caching and a modicum of basic knowledge are usually enough to achieve workable results.
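As a rough illustration of what I mean by "trivial caching" (this is a generic sketch, not any client's system; the query, names, and numbers are all made up), often nothing more than memoizing hot reads in process memory is enough to make a slow query disappear from the pain-point list:

```kotlin
import java.util.concurrent.ConcurrentHashMap

// Hypothetical stand-in for a slow aggregate query (say, 100ms per round trip).
fun queryOutstandingBalance(customerId: Long): Double {
    Thread.sleep(100)        // simulate the database round trip
    return customerId * 10.0 // placeholder result
}

// The "trivial caching" in question: memoize hot reads in process memory.
private val balanceCache = ConcurrentHashMap<Long, Double>()

fun cachedOutstandingBalance(customerId: Long): Double =
    balanceCache.computeIfAbsent(customerId) { queryOutstandingBalance(it) }

fun main() {
    println(cachedOutstandingBalance(42)) // first call pays the full query cost
    println(cachedOutstandingBalance(42)) // repeat calls return from memory
}
```

A real system would of course need invalidation or a TTL, but the point stands: a few lines like this, plus adequate hardware, takes most shops out of the market for deep optimization work.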

Egregious inefficiency is now workable. Which is a great thing in many cases, and it mirrors our industry-wide move to less efficient runtimes and platforms. There are still a few edge cases demanding extreme efficiency, but as with assembly programming, they aren't enough to justify a dedicated focus for most.

The potential market for database optimization just isn't as large or as lucrative. And when things do go awry on a millions-of-IOPS system with 2TB of RAM and 96 cores, it's usually so catastrophically, fundamentally broken that fixing it isn't really a fun pursuit, usually necessitating a significant rebuild.

For the past half year I've moved primarily to cryptocurrencies and their related technologies (e.g. public blockchain implementations and standards, as well as payment systems). I'm not chasing a "whale", or hoping to ride a bubble to great riches, but rather becoming deeply knowledgeable about every emergent standard, and about the currently dominant platforms and their source code, and offering that knowledge to individuals and firms looking to integrate, to leverage, or to bet. It has been rewarding.

All the while I continue with the entrepreneurial efforts. My latest is with the iOS ARKit, which has been a fun experience. I should announce something on that soon.

Otherwise I continue with fun projects. I still have a contour mapping application on the go. It started as a real estate sales project, but that wrapped up so quickly and successfully that it became simply an interesting vehicle to explore Kotlin and Undertow, to target iOS and Android, and to learn about pressure propagation, WebGL mapping, millisecond time synchronization (I use four current smartphones in concert), and so on. I have a mostly complete posting on that, long promised, that I'll finish soon.

[1] – I hit some financial roadblocks, leading to a crisis, when I put aside all revenue production to try to focus entirely on entrepreneurial efforts that seemed, for so long, on the absolute cusp of success. Lessons learned.

Glorious Fall In the Northern Hemisphere

Everything is spectacular right now, and I'm hard at work on some exciting projects. I hope to finish a barometric land topography piece within the week (an enjoyable side project that I mostly used to get acclimated with Kotlin, and then decided to use as a vehicle to play with the iPhone 8 and re-acclimate with Xcode and Swift), and it should make for some interesting content. Beyond that, I hope to be very prolific in my creations over the next while, having the time and attention to make the most of opportunities.

Thank you to all who have queried on the quiet — it will be worth the wait.

Embrace AMP or AMP Wins

I've written about AMP (Accelerated Mobile Pages) on here a few times. To recap, it's a standard of sorts and a coupled set of code.

If you publish a page through AMP, you are limited to a very narrow set of HTML traits and behaviors, and a limited, curated set of JavaScript components providing basic functionality, ad networks, video hosting, and metrics, with scripts hosted on the Google-owned and -operated cdn.ampproject.org. You also implicitly allow intermediaries to cache your content.

If you search Google using a mobile device, links with a little ⚡ icon are AMP links that will be loaded from the Google cache, and by rule (which is verified and enforced) live within the limitations of AMP. You can’t claim AMP conformance and then resort to traditional misbehavior.

The news carousel is populated via AMP links.

Many publishers have gotten on the AMP bandwagon. Even niche blogs have exposed AMP content via a simple plug-in.

AMP is winning.

But it has significant deficiencies, for which it has earned a large number of detractors. There are technical, privacy and web centralization issues that remain critical faults in the initiative.

Anti-AMP advocacy has reached a fever pitch. And that negative advocacy is accomplishing exactly nothing. It is founded in a denial that is giving AMP a clear road to world domination.

Because in the end it is a better user experience. Being on a mobile device and seeing the ⚡ icon is an immediate assurance that not only will the page load almost instantly, it won't have delayed-load modal overlays (Subscribe! Like us on Facebook!), it won't throw you into redirect hell, and it won't have device-choking scripts doing spurious things.

Publishers might be embracing a Pyrrhic victory that undoes them in the end, but right now AMP makes the web a more trustworthy, accessible place for users. It is a better experience, and it helps keep the web from becoming a tragedy of the commons, where short-sighted publishers desperate for a short-term metric create such a miserable experience that users retreat into gated communities like Facebook or Apple News.

We could do better, but right now everyone has exactly the wrong approach in confronting AMP.

“We don’t need AMP: We have the powerful open web, and publishers can make their pages as quick loading and user-friendly as AMP…”

This is a losing, boorish argument that recurs in every anti-AMP piece. It is akin to saying that the EPA isn’t necessary because industry just needs to be clean instead. But they won’t. AMP isn’t an assurance for the publisher, it’s an assurance to the user.

AMP isn't a weak, feel-good certification. To publish via AMP you allow caching because the cache host validates and forcefully guarantees to users that your content lives within the confines of AMP. You can't bait and switch. You can't agree to the standard and then sneak in just one small exception. That is the power of AMP. Simply saying "can't we all just do it voluntarily" misses the point that there are many bad actors who want to ruin the web for all of us.

But the argument that, because AMP is a subset of the open web, it therefore isn't needed (missing the point entirely) is self-defeating: it short-circuits any ability to talk about the need that AMP addresses, and about how to build a more palatable, truly open, and truly beneficial solution.

We need an HTML Lite. Or HTMLite to be more slogan-y.

The web is remarkably powerful. Too powerful.

We have been hoisted by our own petard, as the saying goes.

With each powerful new innovation in web technologies, we empower the bad actors among us who degrade the experience for millions. For classic textual content of the sort we all consume in volume, this power is destructive to the web's own long-term health. Many parties (including large players like Apple and Facebook) have introduced alternatives that circumvent the web almost entirely.

A subset of HTML and scripting. A request header passed by the browser that demands HTMLite content, with the browser and caching agents enforcing those limits on publishers (rejecting the content wholesale if it breaks the rules).
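To make that idea concrete, a hypothetical negotiation might look like the sketch below. Nothing here is a real standard: the media type, the allowed-tag list, and the validation function are all invented for illustration of the enforcement concept.

```kotlin
// Hypothetical content negotiation for an "HTMLite" profile. None of these
// header values exist today; they are placeholders for the idea described above.
const val HTMLITE_MEDIA_TYPE = "text/html; profile=htmlite"

// Tags the proposed subset would allow; anything else gets rejected wholesale.
val allowedTags = setOf("html", "head", "body", "p", "a", "img", "h1", "h2", "ul", "li")

// A browser or caching intermediary would enforce the contract, not trust the publisher.
fun acceptForHtmLiteClient(requestAcceptHeader: String, documentTags: List<String>): Boolean {
    val clientDemandsLite = requestAcceptHeader.contains("profile=htmlite")
    val documentConforms = documentTags.all { it.lowercase() in allowedTags }
    return !clientDemandsLite || documentConforms
}

fun main() {
    println(acceptForHtmLiteClient(HTMLITE_MEDIA_TYPE, listOf("html", "body", "p")))      // true
    println(acceptForHtmLiteClient(HTMLITE_MEDIA_TYPE, listOf("html", "body", "script"))) // false: reject wholesale
}
```

The key design point, as with AMP, is that the limits are enforced at delivery time by the client and the cache, so conformance is a guarantee to the user rather than a promise from the publisher.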

We need to embrace the theory of AMP while rejecting the centralized control and monitoring that it entails.

This isn't simply NoScript or other hacked-together solutions; it needs to be a holistic reconsideration of the basics of what we're trying to achieve. Our web stack has become enormously powerful, from WebGL to SVG to audio, video, conferencing, geolocation, and notifications, and that is gross overkill for what we primarily use it for. We need to fork this vision before the web becomes a graffiti-coated ghetto where only the brave tread, the user base corralled off into glittery alternatives.

Summer Draws To A Close

I hope those of you in the Northern Hemisphere had a glorious summer. And for those in the Southern, I hope a great summer awaits.

Now that the summer break draws to a close, it's time to get back to projects and roll out some deliverables. Two personal projects I've committed to delivering in the near future are a video app (for gradual, trickle monetization1 reasons) and a multi-device contour mapping/property mapping app leveraging reference barometers, GPS, and GLONASS where available. That one is just for fun. Along with various professional things (I'm available for your projects). And I'll start spinning up content on here.
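For those curious about the barometric side, the core conversion is the standard international barometric formula relating pressure to altitude (the same relationship Android's SensorManager.getAltitude implements). A minimal Kotlin sketch follows; the reference handling and example numbers are illustrative, not what the app actually does:

```kotlin
import kotlin.math.pow

// Standard international barometric formula: altitude in meters from station
// pressure, relative to a reference pressure (both in hPa). A calibrated
// reference barometer at a known elevation supplies p0; the roving phones supply p.
fun pressureToAltitudeMeters(referencePressureHpa: Double, pressureHpa: Double): Double =
    44330.0 * (1.0 - (pressureHpa / referencePressureHpa).pow(1.0 / 5.255))

fun main() {
    // Example: reference at standard sea-level pressure, phone reading 1001.2 hPa.
    val elevation = pressureToAltitudeMeters(1013.25, 1001.2)
    println("Relative elevation: %.1f m".format(elevation)) // roughly 100 m
}
```

The interesting work is everything around that formula: keeping the reference and the roving devices time-synchronized so that weather-driven pressure drift cancels out of the relative measurement.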

Otherwise it’s been a period without much to talk about on here.

One of the few really interesting things over the summer has been the adoption of Kotlin as a first-class language in Android Studio 3. Kotlin is a product of JetBrains, the creators of the excellent IntelliJ IDE (which Android Studio is based upon), and it's a language I hadn't paid attention to previously: I veer away from tools and languages that introduce less common dependencies when projects are handed off to other teams, as they can be a problem during technical due diligence. Now that Kotlin is first-class on a major platform, I finally took the plunge, learning and adopting it for those projects where the JVM or its analogues (e.g. Android) are part of the solution.

And it's actually a really compelling language: the sort of fluid, intuitive programming experience that Go gives me. It dramatically reduces the enormous trove of boilerplate that Java often demands.
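A tiny example of the boilerplate reduction I mean (the class itself is made up, but the mechanism is standard Kotlin): a single data class declaration gives you the constructor, accessors, equals/hashCode, toString, and copy that a comparable Java class would spell out by hand.

```kotlin
// One line replaces the usual Java fields + constructor + getters + equals/hashCode/toString.
data class Quote(val symbol: String, val bid: Double, val ask: Double)

fun main() {
    val q = Quote("ACME", 101.25, 101.30)
    println(q)                       // Quote(symbol=ACME, bid=101.25, ask=101.3)
    val wider = q.copy(ask = 101.40) // non-destructive "with"-style update
    println(wider == q)              // structural equality for free: false
}
```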

Ultimately programming languages are largely interchangeable. Almost anything can be implemented in just about any language; the number of lines will vary, the readability will fluctuate, and so on, but in the end you never strictly have to change languages. Yet there is something almost indescribable about languages like Go, and now Kotlin, where the implementation is so fluid with your thought process that building more complex solutions feels effortless. I am a big fan of Go and readily acknowledge that it has an enormous litany of deficiencies, but something about it just makes great solutions appear. Other languages have great features on paper, yet nothing of consequence ever seems to be created with them. Kotlin is unique in that it not only brings that power, it also has a pretty compelling set of modern features.

It is hardly perfect, of course: Its construction was clearly bounded by the limits of the JVM (though you can target native code and several other platforms), so it doesn’t have the greenfield benefits of something like Rust, but it is an enormous improvement over Java when I have to work in that domain.

1 – A big change in my efforts is that I'm going to focus far more on sustainable recurring income (both of the hired-help and built-apps-for-monetization variety), versus "shoot for the moon" initiatives. Technology sales are unbelievably long, drawn out, and risky, with a process that is almost impossible unless you're willing to be personally "acquired" in the transaction, committing to relocating in the process (which I am not willing to do; short-term on-sites are fine, but changing countries is not). Thousands of hours, and in the end the roadblocks turn out to be the most trivial of things, all of it distracting from other income sources. Consulting efforts have their own gamut of problems, but some recurring revenue is far better than the remote odds of an occasional large jackpot.

Technology and Population Density Trends

A bit of a rambling, conversational piece today.

Amara's Law states:

We tend to overestimate the effect of a technology in the short run, and underestimate the effect in the long run.

We declare that a technology changes everything, then realize that entrenched patterns, behaviors, and small hangups limit broad adoption. We discount it as over-hyped and less significant than expected. Then it quietly takes over and changes the very foundations of society and our lives.

Recently I was pondering what electric cars and self-driving cars would do to population density. The former (mechanically simpler vehicles with a much less expensive energy source) will significantly reduce the cost of driving as it achieves economies of scale, while the latter will reduce the inconvenience of driving, commuting in particular.

Self-driving cars will not only allow us to do other things during the ride, they will significantly increase the capacity of our roads to handle traffic by reducing human error and inefficiencies.

Intuitively, at least to me, these changes should encourage lower-density living. The home that previously came with an expensive, grueling three-hour commute now comes with a relaxing period to watch a Netflix series or catch up on email.

Pondering the probable social change from self-driving EVs led me to reflect on the changes of the past several decades. In Canada, as an example, the lower-density areas of the country (the Atlantic provinces, rural areas, small towns and villages) are hollowing out. The high-density areas, such as the Golden Horseshoe around Toronto, are a magnetic draw for all of Canada and continue growing at a blistering pace.

Even if a home in the Toronto area costs 5x the price for a given set of amenities, and even if a hypothetical person might prefer lower density, many forces still draw them in.

Which is strange, in a way. I grew up in a small city and seemed to be completely isolated from the larger world. Calling a relative 20 minutes down the road was a long-distance call. My town had no computer store, a mediocre BBS, few channels on television, no radio station, and so on. There were few resources for learning.

I was wide-eyed at the options available in the big city.

Yet today we live in a world where that same small town has inexpensive 100Mbps internet, and its residents can communicate with anyone around the globe in an instant. Where you can order just about anything and have it the next day, or even the same day. Every form of entertainment is available. Every resource and learning tool is a couple of clicks away (as an aside, education is one area that has yet to see the change this new world will bring). Few of the benefits of density are missing.

But those same changes led to centralization, and a hollowing out of most of the better jobs, with the workforce having to follow.

We centralized government and administration, pulling school boards, government offices, banks, and the like out of those small towns in the quest for efficiency, moving up the density ladder. Five small villages amalgamated into a single board, which then got pulled into a larger board in the city an hour up the road, and so on. Connectivity means that management for the few remaining vestiges of local structure can sit in a far-flung location.

Every medical specialty moved to larger centers as car ownership became prevalent and long drives were accepted. Seeing a pediatrician 200km away became the simple norm. Services and even retailing get centralized to some unknown place elsewhere on the globe.

Everything centralizes. Because it can.

Most decent jobs require a move to density. The same forces that brought the conveniences of the city to far-flung locations also relegated the small town to being essentially a retirement home.

Reconsidering it in that light, the probable changes from EVs and self-driving cars will likely accelerate that migration.