
SHIP'S LOG

Welcome to George Moakley's Blog Page


I read to be entertained, so I write to entertain.

What I love most about quality fiction, and my highest aspiration as a writer, is the ability to immerse the reader in a world they’ve never known, and to make it so real they feel like they’ve really been there.

This applies to science fiction, historical fiction, fantasy, a detective story set in a city you’ve never visited, ...

Anything, really.

Achievement of such immersion is an art.

Failure is all too easy.

Poorly crafted writing will do it. Typos, poor grammar, repetitive words and phrasing: anything that jars the mind and thereby breaks the mood. A talented editor is your best strategy (thank you, Abigail!).

A greater challenge is credibility, which, in turn, breaks into two overlapping buckets (Venn diagrams, anyone?).

The first is accuracy.

Whatever your profession, whatever your pastimes, you have expertise.

When the story involves something that is, for your reader, an area of expertise, and you’ve failed to do your homework, you risk breaking the mood and losing your reader.

The second is canon.

As you weave your story, as you build your world, you set precedents.

And breaking those precedents also breaks the mood.

For hard science fiction, there’s a lot of overlap between these buckets. The more your world building is committed to real science, the easier it should be, at least in theory, to honor canon.

But honoring canon is just as important regardless of genre.

If you, early in your story, introduce a notion about what’s lethal to vampires or how magic works or how, historically, elves and dwarves have gotten along, you just can’t change those rules on the fly.

I’m a particularly challenging audience in this regard for two reasons.

One is that I’m a detail-oriented guy, and the other is that I have a diverse resume.

My background is in evolutionary biology and theoretical ecology, but I went into the tech industry. I worked in precious metals exploration, modeling ore bodies. I managed data centers and telecommunications for an aerospace company. I worked as a strategic planner and solution architect in semiconductor manufacturing and professional services. I ran R&D for a tech company developing prototypes and filing patents related to distributed computing, IoT, and AI. I’m a dive instructor (not currently active). I’m a photographer.

I could go on.

Bottom line? There are a LOT of things that come up in books, movies, TV shows, etc., that I know enough about to recognize when an author hasn’t applied due diligence and I’m sufficiently detail oriented that it kills the story for me.

The movie’s set in Southeast Asia, but the actors are walking past Central American trees or rock formations unique to somewhere in the US?

Drives me crazy!

The hacker starts banging away at a keyboard and somehow magically breaks into an alien spacecraft’s network?

No way!

We’re on an alien moon filled with absolutely gorgeously envisioned alien lifeforms but, for some reason, despite ALL the vertebrate alien life forms having 6 limbs the intelligent life forms are humanoids with only 4 limbs? And, somehow, they’re able to mate with their hair and the same braid ‘joining’ is how they interface with whichever 6 limbed life forms they’re riding?

C’mon!

And it REALLY drives me up the wall when our heroes or villains have fundamental and unexplained changes in capability. Somehow, inexplicably, an attack the monster was able to shrug off at the beginning of the story is now deadly. Or (see if you recognize the movie) the hero tells us, early on, that the surviving armed and trained space marines cannot rescue their friends, but, somehow, miraculously, she is able to singlehandedly rescue the little girl who’s become her substitute daughter.

I mean, I still LOVE these movies, but find these things jarring every <censored> time I watch these otherwise fantastic movies.

I could go on...

So, as a writer, what can you do?

One reason I love hard sci fi is that the story could really happen!

Another is that it should be easier to preserve canon if canon is based on facts.

But regardless of genre, being that detail-oriented, hyper-organized guy, I believe in documenting canon.

As I write my stories, I have a growing compilation I use as a reference.

Character backstories. Detailed designs of every vehicle, every structure. Detailed descriptions of the taxonomy, structure, and behavior of every organism described in every story. A high-level future history. I’ve even modeled the populations of every colony referenced in the stories, throughout our solar system and beyond. I’ve referenced typical birth and death rates under varying conditions. Space allocations on ships. Agricultural productivity per hectare.

EVERYTHING I could think of.

And a lot of it never shows up in my novels, but I record it anyway. Sometimes to help me envision things accurately for those details that DO show up in the stories, and sometimes because, well, you never know, do you? Something recorded for thoroughness might become relevant to a future story, right?
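For the curious, this kind of population bookkeeping doesn’t require anything fancy. Here’s a minimal sketch of the sort of year-by-year colony projection I’m describing. None of these numbers come from my actual canon ledger; the rates and the colony are purely hypothetical, made up just to illustrate the technique.

```python
# A minimal colony population projection. All rates here are hypothetical,
# chosen only to illustrate the bookkeeping; a real canon ledger would
# record rates per colony and per era.
def project_population(initial, birth_rate, death_rate, net_arrivals, years):
    """Project a colony's population year by year.

    birth_rate and death_rate are annual per-capita rates;
    net_arrivals is a flat yearly figure (immigration minus emigration).
    """
    population = float(initial)
    history = [round(population)]
    for _ in range(years):
        # Natural growth plus arrivals from ships, applied once per year
        population += population * (birth_rate - death_rate) + net_arrivals
        history.append(round(population))
    return history

# Hypothetical colony: 2,000 founders, 2.0% annual births, 0.8% annual
# deaths, 150 net arrivals per year, projected over 25 years.
timeline = project_population(2000, 0.020, 0.008, 150, years=25)
```

Even a toy model like this catches canon slips: if a colony of 2,000 founders is somehow fielding an army of 50,000 a generation later, the ledger will tell you.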

All of this said, I do recognize that it’s certainly possible that I’ve fallen short of my own ideals.

There are times I’ve done my best to be scientifically accurate, but I’m particularly concerned that I may have fallen short with respect to some aspects of physics and astrogation.

And, if I have, I apologize.

But I’ll say this.

If I have fallen short with respect to specific aspects of physics and astrogation, I have certainly remained consistent.

Because, if nothing else, whatever you do, DON’T BREAK CANON!

 
 
 
  • gpmoakley
  • Oct 26, 2025
  • 5 min read

I spent a lot of years in the tech industry, and, for a lot of those years, my job was strategic planning.


So, a recent social media post reminded me that most of us don’t recognize that we are living through an ongoing, profound socio-economic upheaval, primarily because the last few years have been a period of relative stability within the punctuated equilibrium of that upheaval.


Neither the agricultural nor the industrial revolution happened overnight. People didn’t go to sleep one night as nomads and wake up the next morning with plowed fields, nor did they ride horses home one night and go back to work the next day in an automobile.


These transitions took decades.


So, when I read this post (essentially, ‘if the economic policies of the current administration are bad, why is my retirement account doing so well?’), it reminded me that, while most of us recognize that we are living through an information revolution, most of us aren’t recognizing it as a succession of automation waves.


It’s instructive to understand the ongoing information revolution as a punctuated equilibrium. It has been, and continues to be, a series of upheavals facilitated by technological innovations but driven by productivity gains realized through successive waves of digitization.


Think of any organization, business or government, as a set of processes.


Then think of those processes as groupings of activities performed by humans, with one grouping after another digitized by a set of technologies.


Then recognize that each wave of digitization, by displacing humans, improves organizational productivity by running faster, better, and cheaper.


But not without profound socioeconomic disruption.


In the 1950s, a ‘computer’ was a human armed with a slide rule, tasked with performing calculations related to maintaining the general ledger, populating actuarial tables, and other bulk calculation tasks.


These jobs were replaced by mainframe computers.


Offices used to pay people to maintain rooms filled with filing cabinets storing all sorts of records related to human resources, customer accounts, payment processing, and so forth.


These jobs were initially replaced by minicomputer-based departmental systems that have now, in turn, been replaced with ERP systems like SAP.


Offices used to pay administrative assistants to take dictation, prepare documents, generate presentations, and perform a host of other functions replaced by office suite software products running on personal computers.


Companies used to pay people to generate and distribute physical catalogs, and other people to receive and process orders received by mail and, later, by phone. These jobs were replaced by the Internet and e-commerce.


Again, each wave of digitization has, by replacing humans, improved organizational productivity.


Each wave has, by replacing humans, created profound disruptions. One set of people discovered opportunities related to enabling and maintaining the automation systems that put another set of humans out of work.


And we, here in the US, have generally done a very poor job of facilitating this transition; the people whose skills were no longer required have struggled to navigate their new reality, often with geopolitical ramifications.


But organizations that failed to capitalize on these waves of automation and their associated productivity gains failed to survive.


As painful as it is to recognize that your job is being replaced by a machine, the fact is that, if your company doesn’t replace you with a machine, the company that puts your company out of business will.


The same can be said for offshoring work. We complain bitterly about companies that offshore work at the expense of domestic jobs, then enjoy the lower prices offered by the companies that outcompete those companies that refuse to offshore or automate.


And, eventually, even the jobs that have been offshored are, subsequently, automated out of existence, enabling the companies that have capitalized on offshoring and automation to continue to offer better products and services at lower prices.


We should all bear this in mind when politicians glibly promise the impossible. Yes, it would be lovely to restore these jobs, but we are the ones that, by choosing the best products and services at the best prices, drove the replacement of those jobs through offshoring and automation.


There is a profound irony to watching consumers with a ‘buy American’ bumper sticker transfer imported products from their shopping cart to their car, which, even if it has an American car company badge, is likely to have been imported entirely, or, if not an import, is definitely composed of a long list of imported parts.


The competitive advantages related to each wave of digitization are irresistible.


Organizations that fail to capitalize on these waves will not survive competition with organizations that take advantage of them.


Gravity sucks, but, if a rock is falling, you’d best step out of the way, because all the campaign promises in the world are not going to spare you a lot of pain if that rock hits you.


As stated above, these waves of digitization create a punctuated equilibrium: periods of relative stability interrupted by upheavals as organizational adoption of technological innovation digitizes business processes.


It’s also worth noting that each upheaval follows a pattern.


As new technologies emerge, organizations must weigh when to get on board. Early applications tend to focus on applying the new technology as an incremental improvement.


To my mind, the best example of this would be smartphones.


There was a brief flurry of excitement as early devices provided the ability to access the Internet, but, as the novelty wore off, the tiny screen and traditional telephone keypad failed to offer a satisfying experience.


Incremental improvements were made. Tools were developed that allowed websites to tailor content for smaller screens. Keypads were improved.


But smartphones didn’t drive an upheaval until we shifted from trying to recreate a PC experience to offering a profoundly new experience through ‘apps’ on iOS and then Android devices.


We shifted from trying to do the same thing better to doing a better thing.


Another part of the pattern is the emergence of new companies and industries related to each wave. IBM and the mainframe. SAP and ERP systems. Cisco and networking. Intel and Apple and Microsoft and PCs. Amazon and Google and the internet. Meta and social media.


And it’s worth noting the companies that bet heavily on doing the same thing better but failed to do the better thing suffered as a result.


Digital Equipment Corporation insisted PCs would never be more than a better way to engage their minicomputers. Intel insisted smartphones would never displace PCs.


So, where does that leave us with respect to that social media post?


Why, if current economic policies are widely decried by economists, is the market soaring?


Because we are in the foothills of the next disruptive wave, and the investments related to that wave are driving up stock prices despite the economic fallout of tariffs and trade wars.


The disruptive enabling technologies are Artificial Intelligence, Internet of Things, Sensory Computing (not just ‘visual computing’ but other sensory modalities as well), and Augmented Reality.


Because we are in the foothills of this disruption, we’re focusing on doing the same things better.


Discussions of IoT, Sensory Computing, and Augmented Reality have been displaced by discussions of AI, but realizing the full potential of any of these capabilities relies on blending them.


We will see a lot of companies make expensive, ill-advised early bets based on trying to make incremental improvements to existing solutions, thereby missing the revolutionary potential of these capabilities.


We will see startups become behemoths as they offer better solutions fully capitalizing on these capabilities.


We will see at least one, if not multiple, market upheavals as bad bets lead to disillusionment followed by realization of what this upheaval can bring.


And we will see profound disruption in the workplace as yet another wave of human facilitated processes are digitized.


I am profoundly concerned that an electorate that doesn’t understand how these punctuated equilibria progress will miss the forest for the trees, and think the current administration’s economic policies are driving market performance rather than being obscured by the productivity advancements of the next wave.

 
 
 
  • gpmoakley
  • Aug 16, 2025
  • 4 min read

As mentioned in my last blog, I had the pleasure and privilege of being interviewed by Don McCauley for “The Author’s Show” (you can find a link to the interview at https://www.georgemoakley.com/new-events if you scroll down to the bottom of the page).


And, as I’ve mentioned, many of the questions were quite thought provoking.


For example, he asked whether there was a central message to the book.


My response was that I read to be entertained, so I write to entertain.


I find ‘preachy’ novels tedious.


That said, my stories do reflect my values, and there are a number of tropes I avoid. Such tropes include the notion that the peril confronting our heroes is the result of ‘the scientists’ messing with something they shouldn’t, with the peril compounded by the naiveté or outright stupidity of key characters.


Believable, sympathetic characters will, like all of us, have their strengths and weaknesses, but there’s a difference between smart people working their way through a problem, learning along the way, and people taking cringingly stupid actions.


In my novels, the characters each have their respective areas of expertise. When confronted by a novel peril, they have to figure it out. They make mistakes, costly mistakes, but they’re understandable mistakes based on what they do and do not yet know when they make them.


And part of the fun, at least for me as a reader, is feeling like you’re part of the discussion as these experts struggle, together, to understand and address the threats they face, especially if this includes learning a thing or two along the way.


Dealing with the existential threats the Eden colonists face in ‘Kraken of Eden’ and ’Tides of Eden’ requires understanding the biology of the threats, applying the scientific method, and sound and practical engineering.


In particular, they have to apply, as discussed in another of these blogs, the part of the scientific method that deals with paradigms.


This all came to mind, recently, as I scrolled through a social media ‘debate’ regarding evolution.


There were, of course, participants that dogmatically reject evolution entirely.


But there were also quite a few lay people whose understanding of evolutionary theory was sufficiently superficial that, despite their best efforts, they could not effectively engage in the debate.


And, really, that shouldn’t be an issue. Most people really don’t need to be experts in evolutionary theory, any more than they need to be, say, experts in automotive transmission design or carpentry or quantum theory.


We each have our areas of expertise worthy of others' respect, and we each should respect the expertise of others.


But, these days, as exemplified in far too many debates, we are experiencing not only a denial, but an outright disdain for expertise.


Consider attending an entertainment event. Buying a ticket to see a movie, or a concert, or a sporting event.


The fact that you’re willing to spend money on such a ticket indicates that you recognize and value that other people have not only an innate talent but have also invested considerable time, energy, and resources to nurture that talent.


You certainly wouldn’t spend money on a ticket to watch me play tennis, I assure you.


Why is it that we’re willing to acknowledge, celebrate, and, as measured by ticket prices, value the expertise of entertainment and sports figures, yet far too many of us reject the expertise of scientists or economists or any of the other areas of expertise relevant to so many of the important issues we face?


For example, one of the arguments made against evolutionary theory was the entropy argument: why do we see increasing sophistication of life forms, such as the progression from single cells to humans, if there’s not a divine force driving this progression?


This argument presumes that there’s a progression, which is a popular misconception driven by the depictions we’ve all seen of a sequence of figures, typically left to right, starting with either a crouching ape or a fish crawling up a beach and ending with a dude in a suit or carrying a briefcase or something along those lines.


Yes, especially in popular literature, we tend to depict evolution in hindsight. We start with the modern form, then depict, before it, its evolutionary predecessors.


But that’s NOT how evolution works.


Evolution is not a linear progression.


Evolution is a bush.


Life forms diversify randomly through their inherent variability and through mutations.


Sometimes, especially as environmental conditions change, these variations affect reproductive success, and sometimes they do not. At any given time, the individuals we see are the descendants of the individuals whose variability favored their reproductive success.


If you look at the history of life on our planet, sure, you see, AMONG MANY OTHER THINGS, an increasing sophistication of some life forms.


But such 'advances' are far, far outnumbered by the proliferation and diversification of the rest of the planet’s biology.


There are, for every such advancement, a LOT of new bacteria, viruses, and so forth.


Let’s look at this another way.


If you decided to flip a coin 10 times, the odds that you’ll get 10 heads are pretty poor.


But if you decided to flip a coin 10 times, and you decided to repeat that exercise 1,000 times, the odds are better than 60% that, at least once, you’ll get 10 heads.


When we depict evolution in terms of the ancestral forms of a modern sophisticated species, we aren’t seeing divine intervention in our coin flipping.


We’re just ignoring all the coin flips that didn’t produce the kind of exceptional result that captures our attention.
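The coin-flip arithmetic is easy to verify yourself. Here’s a quick Python sketch, both the exact calculation and a simulation; the specific numbers (other than the 10-flips-repeated-1,000-times setup above) are mine, not from anyone else’s argument.

```python
import random

# Probability of flipping 10 heads in 10 tries with a fair coin
p_all_heads = 0.5 ** 10          # 1/1024, about 0.000977

# Probability of seeing at least one all-heads run across 1,000 repetitions:
# one minus the chance that all 1,000 repetitions fail to produce it.
p_at_least_once = 1 - (1 - p_all_heads) ** 1000   # about 0.62

# Monte Carlo sanity check: run the "1,000 repetitions" experiment many
# times and count how often at least one all-heads run shows up.
random.seed(1)
experiments = 2000
hits = 0
for _ in range(experiments):
    if any(
        all(random.random() < 0.5 for _ in range(10))  # one run of 10 flips
        for _ in range(1000)                           # repeated 1,000 times
    ):
        hits += 1
estimate = hits / experiments

print(f"exact chance of 10 heads at least once in 1,000 tries: {p_at_least_once:.3f}")
print(f"simulated estimate over {experiments} experiments:     {estimate:.3f}")
```

The point stands either way: rare outcomes stop looking miraculous once you count all the attempts, not just the one that caught your eye.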


People who have studied evolution understand this and could explain it, but, when we don’t respect expertise, we don’t listen.


And this IS important!


One of the reasons scientists find this frustrating is that evolutionary theory isn’t just a theory. It is a conceptual framework, a paradigm, within which biology makes sense. Evolutionary theory provides the context within which scientists can make and test useful predictions.


Want to improve crop yields? Want to detect and prevent, or at least respond to, a pandemic? Want to breed a new, really cute breed of dog or cat?


Evolutionary theory provides the foundation for that work.


So, yes, I read to be entertained, so I write to entertain.


But, in my novels, experts and their expertise are not only respected, they are essential to the plot…

 
 
 