About
This is the first post of a set of two posts containing discursives I wrote as practice for the High School Certificate (HSC) Advanced English module ‘The Craft of Writing’ in 2024-2025. Hopefully I said some interesting things.
Ontogenic Recontextualisation of the Imaginary via Pseudo-Intellectual Semiotic Inscription
Stimulus: “There are some people who leave impressions not so lasting as the imprint of an oar upon the water.”
Let’s be honest. Neither of us has any idea what that means. Yet we see phrases like this whenever we crumple onto our beds late on a Friday evening, doomscrolling JSTOR for the latest academic commentary on the Oedipean undertones permeating the thirty-second episode of Octonauts. And, personally, I start to get a headache, and then I begin to realise for about the fifth time this week that I really should get new hobbies.
But I’m sure most of us who have been exposed to the prosperous world of academia have, at some point, read something, or heard something, and thought to ourselves:
“???”
And maybe you sit there for at least ten minutes, re-reading that same sentence over and over. Maybe you leave the conversation pondering what they meant by “the signified qua ontic abstraction is interdependent on noumenal expression”. Maybe it’s not a sentence. Maybe it’s just a block of abstract mathematical symbols, Greek letters, the letter ‘L’ in three different fonts to distinguish between the Laplacian and the Lagrangian and the Lebesgue measure.
Entertain the thought that you somehow decipher this modern Rosetta Stone. What does it mean?
I’ll tell you. Nine times out of ten, it’s astoundingly comprehensible. Perhaps it reduces to:
“A thing needs to exist to have meaning.”
Or:
“North poles must be paired with a south pole.”
Humans love being looked up to. They love feeling indispensable, unique, an irreplaceable screw in a greater machine. And our advancements in digital globalisation haven’t made it better. Now, more than ever, people are separated by accessibility.
We read the thoughts of far more intelligent people than us daily. And instead of feeling connected to the greater corpus of research, of knowledge, we feel…
Irrelevant? At the very least, I get the impression that I’m sitting at the bottom rung of the Great Ladder of Significance. If all these people can come up with these great ideas, how can I ever find my footing? Or will I drop off the ladder into the abyss of irrelevance?
The way to combat this existential well is, for some, regrettably simple.
Perhaps instead of “x is prime”, they replace it with “$\nexists k \in \mathbb{N} : (1 < k < x) \wedge (k \mid x)$”. Perhaps instead of “social paradigms”, they replace it with “public idealisms of ontic metanarratives”. Every new impermeable phrase ripples the water. Every new serving of semiotic spaghetti satiates their own self-importance. That the plebeian must beg for enlightenment – “what does it mean?” – the very thought only propels them up the imaginary rungs.
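For what it’s worth, that symbolic definition really does unravel into a one-line check. A minimal sketch (my own illustration, not anything from the literature) of how little the notation is hiding:

```python
# "x is prime" = no k strictly between 1 and x divides x.
# Deliberately naive trial division; the point is the brevity, not the speed.
def is_prime(x: int) -> bool:
    if x < 2:
        return False
    return not any(x % k == 0 for k in range(2, x))

print([n for n in range(2, 20) if is_prime(n)])  # → [2, 3, 5, 7, 11, 13, 17, 19]
```

Same content as the quantifier soup, minus the begging for enlightenment.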
What I find (admittedly cynically) hilarious about this is its superficiality. The ephemeral illusion, the trick-of-the-light, the shimmering caustic, is shattered when you treat them not as parrots, but as people to engage in a discussion with.
And I guess it makes sense. A reductive understanding of theory leads naturally to its shallow application. The late physicist Richard Feynman is often credited with saying, “If you think you understand quantum mechanics, then you don’t.” Yet people will still claim they find the Feynman path integral totally intuitive. People will still throw about Hegelian dialectics and Hamiltonian dielectrics like party tricks.
I’m not saying that all academics are like this. Plato would have had to rigorously redefine perception and reality to arrive at his World of Forms. Post-structuralist Jacques Lacan re-categorised existence itself over half his lifetime so that he could rigorously talk about floating signifiers and intersubjectivity and whatever else baffles the high school English student. But the vast majority of wannabe-academics today use words as aesthetic noise, not as crystallisations of ideas.
Language exists to communicate ideas. Not to obfuscate them. It’s why there are communities out there trying to explain complex ideas to people “like they’re five”. It’s why in 2001 the internet hoisted up Simple Wikipedia, an initiative aiming to untangle the standard Wikipedia’s expansive web of hyperlinks to prerequisite knowledge.
Our modern fetishisation of intellectual alienation means, to an extent, we all focus too much on appearances. Surface level impressions of intellect are shallow – transient. Perhaps to truly show that you understand something, you need to be able to dumb it down. To lower the barriers of complex jargon and “it’s trivial, really”. Not to get caught in the whirlpool of affectation.
To stir the semiotic sea for something deeper.
Cthulhu and Calamari Rings
Stimulus: “It is terrifying and magnificent, familiar and utterly alien.”
When H.P. Lovecraft published a short story in 1928 about an octopus marginally larger than the ones you can buy from your local fish market, the masses were enthralled.
Because here was the promise of something so abstract, so incomprehensible by sane minds, that it threw readers into a sort of cosmic suspension. The Call of Cthulhu wasn’t interesting because the final sentence was “and they cut him up and made calamari rings and order was restored for the next millennium”, but because it didn’t offer any satisfying resolution or system of logic.
It wasn’t horror because we knew that it was coming for us. It was horror because we didn’t.
We live in an age obsessed with organisation, clarity, understanding. We create to-do lists and colour-coded calendars, our schools demand we note down deadlines, we algorithmically analyse stochastic trends. We mandate students to “show all working”, and scientists to package the universe into one, singular, “Unified Theory”.
The modern world, it seems, won’t rest until everything is distilled down to a spreadsheet, or some abstract mathematical model.
But why?
Sure, we love consistency. Personally, I love the feeling of understanding, the ability to resolve tangled problems, unintuitive paradoxes. Why else would people spend their time solving riddles? They’re contrived problems designed to give us the pleasure of an organised, rational solution to a seemingly contradictory dilemma.
But there’s a reason why white-collar nine-to-five jobs are considered dull. Their quotidian nature detracts from intellectual stimulation. In a sense, perhaps they are too consistent.
We are naturally repulsed by the aspects of life which don’t fit the rubric. Exempli gratia, I abhor chemistry. I spend lesson after lesson wondering why secondary amides have anomalously low boiling points, or why van der Waals forces scale with electron density, or why SN2 reactions even occur. And the answer I often get, is:
“It’s empirically measured; we don’t know why.”
People will accept a bad explanation over no explanation. That electrons somehow distribute over conjugated pi-systems. But ultimately, what we’re fed is just a bunch of words invented by a bunch of theorists sitting in a room attempting to justify a bunch of observations. And our tendency to rationalise makes sense. Incomprehension is discomforting.
Perhaps it should be.
There are some things that we might never understand. Quantum mechanics, one of the most empirically successful theories in physics, essentially describes a world governed by probabilities – one where it’s impossible to know everything, to plot all data, to accurately measure anything at all. And even our own minds house contradictory enigmas. Who can explain the narratives of dreams, the intersection of déjà vu and jamais vu, the semantic satiation that occurs when you read a word a few too many times, and start seeing it as alien glyphs rather than a familiar phrase?
When logic dissipates, we’re forced to confront the boundaries of our epistemological frameworks. That we are trying to map out infinity with finite knowledge, finite time, finite language.
The recognition of contradiction invites play. It invites a teetering existence on the brink of comprehension – terrifying, but enlightening.
The modern world spawns more Cthulhus than we can understand. But perhaps we don’t need to.
Perhaps it’s enough to simply experience.
And on the third day, Jesus slept for just a bit longer
I’m empirically proven to be awful at a lot of things. But the art of doing absolutely nothing is definitely not one of them. In fact, my proficiency at procrastination is so high that as of thirty seconds ago I was in the kitchen, cutting up persimmons for no reason other than to avoid planning this discursive.
But I can’t say it doesn’t bother me.
It’s not just that I’m three weeks overdue on piles of essays, or that all my progress journals have their most recent entry on the 18th of February. It’s that I want to be productive. I want to wake up one day, sit down at my desk, and start thinning the ever-growing to-do list. I want to finish the worksheet that was collected two terms ago. I want to open a blank document, and not immediately redelegate my consciousness to Big YouTube.
A quick internet search floods the results page with innumerable self-help articles – “how to fix procrastination”, “top ten methods to avoid procrastination”; and the one which draws my attention the most: “You’re curable; the second-best time is now.”
Society tells me I can be fixed, and I want somebody to fix me. A perfect combination.
Right?
Well, put it this way. I would not be surprised if the top piece of advice was:
“Change your environment. You’re trying to work against distractions.”
Today’s society seems to be increasingly converging on this type of ‘redemption’: an exoneration of those at fault. It seems to me that contemporary redemption narratives are underscored by a removal of accountability. When you fall into a bad habit, it’s nice to know that “you’re not the only one”. When you fail an exam, it’s nice to know that “there’s always luck involved”. When I prioritise instant gratification over finishing this sentence, it’s nice to know that it’s due to the “stifling academic-industrial complex”, rather than my own stifling academic-industrial incompetence.
The language of redemption acts like a shield, if you think about it. It’s a great way of saying “the first step is acknowledging it’s easy to fix yourself, because it’s not really your problem”. But if we peel back the veneer, the words ring hollow.
I don’t think that chronic gamblers rejoin the table with a singular kidney because they’re not rich enough. I don’t think that the proverbial ‘crypto-bro’ lost half his savings overnight because he shattered some mirror. I mean, sure – not being broke and having better luck might save them in the short term. But by recasting every moral failing as victimhood, we dilute the idea of real, personal responsibility.
Still, it’s probably equally naïve to claim that every issue arises from within. Who can honestly say that those withering away from cancer should crawl out of their hospital bed and find the cure for themselves? But this modern tabula rasa outlook on redemption seems to be advocating for passivity. That there’s nothing inherently wrong with people who guzzle on the mashed remains of those they’ve brutally massacred. That they just need a cozy safe space to get away from the itsy-bitsy spiders of society. That some deus ex machina will jump out and glaze their vision with a sea of ponies and rainbow fairies.
I might be too cynical. Maybe everybody really can be fully redeemed by some external panacea. But a literal incarnation of God couldn’t even purge a measly five billion people of sin. Needless to say, it’s a tall order for our current population of eight billion to be perfected into paragons of virtue by reading self-help articles written by their own, imperfect kind, telling them to look elsewhere for solutions.
It’s Jesus’ turn to sleep in.
And I probably should get out of bed and write my essay or something.
An open-casket communal coffin
When my Latin teacher assigns re-reading the prescribed text for homework, every student sighs with contentment.
It’s not because we love reading lines upon lines of the exact same Latin we’ve already read in class. It’s not because we secretly enjoy the viscerally engaging plot of a dead Italian wandering around the Mediterranean for three hundred lines.
It’s probably because he’s essentially bestowed upon us an afternoon of respite. One where we don’t actually have to read the prescription, because “he won’t check it anyway”. In a sense, there’s an unsaid precept that nobody will be picking up their Latin books that night. There’s a sort of shared understanding that they’re going to be unceremoniously chucked onto the bedroom floor.
But I think there’s a veiled irony behind exactly why we’re able to get away with it. Why he doesn’t check. I mean sure, perhaps it’s just too hard to differentiate a student who’s reread the text outside of class from the sane remainder. But perhaps it’s that he also believes in a ‘shared understanding’ – that as dedicated year 12 students enraptured by the artistic beauty of the classics, we would be positively enthralled to reflect upon our texts in our spare time.
And this feels like where we stumble upon the danger of this unspoken ‘shared’ understanding. That, paradoxically, by never confirming our views, we can never definitively arrive at a mutual agreement. That by insubstantially labelling an opinion as “shared”, we slam the coffin lid over the possibility for actual sharing.
This strikes me as strange, because our modern world seems to be obsessed with verification. White-collar workers are expected to “clock-in” and “clock-out”. We run student work through plagiarism detectors, and more recently, artificial intelligence detectors. Even now, I’m attempting to maintain two portfolios detailing and tracking my project progress.
And more than that – our social-media-percolated ethos seems to fetishise sharing if anything. Muscular men posting their workout routines. Pretty girls sharing their makeup recommendations. Twink femboys oiling up in cat-maid outfits. Hell, they’re probably disclosing a bit too much.
So, in such a society, one where constant verification is endorsed, and shared content is served in mountain-loads on silver platters, how is it that we are unable to peek into the shadows of this fabled common understanding? How is it that we continue to hide the controversies, skirt discussion? That we continue to bury all our strained relationships and countercultural opinions two metres below the ground in a wooden casket – a casket marked with the chipped gravestone of unverified assumptions and “I’m sure everyone understands”?
I mean, shared understanding is helpful. And I can’t say it isn’t convenient. For the conmen who fooled Andersen’s emperor, it was especially satisfying; it’s easy to profit from the induction of some mutual agreement – in this case, that the ‘new clothes’ they weaved were visible to all.
But, as a slightly more ethical exegesis of the fable reveals, our modern concept of “shared understanding” is founded on assumptions. The pretence of objectively recognising what are fundamentally subjective truths.
It’s perhaps not too surprising that the social media accounts I used as examples of sharing a few moments ago are not such great examples of active understanding. We don’t really believe that their bubbly outward personas align with their private selves. But we’ll like it. Because the comfort of our self-deceit – that they truly are what they share – is more desirable than staring upon a pallid corpse in an open coffin.
We won’t validate our shared understanding.
Because it’s scary. Because poking at the web of assumptions that thread the catacombs and weave the pillars of our society is dangerous. We could never challenge the idea that slavery has been successfully abolished. That we are a diverse, multicultural country, without an ounce of racism in our parliament. That we’ve successfully closed the gender pay gap.
We ‘understand’ all this. But I think perhaps we really need to understand that, paradoxically, by never confirming our views, we disconnect ourselves from true shared understanding. Not just a euphemism for “senseless assumption”. That we need to actually check up on the bigger things.
Doesn’t mean I won’t exploit our communal coffin a bit longer though. Tell my Latin teacher that Virgil can wait.
On Simplicity
I’m typing away, hunched over my laptop. My room is dark, save for the flickering light of my bedside clock and the phantasmal glow of my dimmed screen.
It’s 2:03 AM. My research project is two hours and four minutes overdue.
And I remember the beginnings of a distant rumbling, slowly crescendoing over the dense pattering of rain against my window. I remember the flash which briefly imbued my curtains with a ghostly phosphorescence; a strange, ethereal radiance – just for that split second.
And I remember that poem that shimmered under the glistening sheen of the raindrops outside, and probably my own tears.
“The apparition of these faces in the crowd. Petals on a wet, black bough.”
Of course I remember it. It’s not because I ended up submitting 3000 words of detailed analysis on two bloody lines of poetry four hours later, before hurriedly rushing out my door to catch the bus to school. It’s because Ezra Pound’s “In a Station of the Metro” is just so simple.
Simplicity is desirable. We love when problems are simple. We love when they have defined boundaries. Rigid constraints. Straightforward, clear-cut, motivated solutions.
I’ve dabbled in my fair share of programming problems. And what pains me more than to see the shortest solution take well over three hundred lines of the messiest case-bash in existence? Or weak test data that has me spending hours upon hours debugging, trusting that my old code, which passes over a hundred test cases, is correct? (Hint: it’s not.) Likewise, what pleases me more than a three-line implementation?
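For the uninitiated, the sort of three-line implementation worth savouring tends to look like this. The essay names no particular problem, so here’s my own stand-in: Euclid’s algorithm, which replaces any amount of case-bashing over divisors with a loop you can hold in your head.

```python
# Greatest common divisor via Euclid's algorithm: repeatedly replace
# (a, b) with (b, a mod b) until b hits zero. Three lines of body.
def gcd(a: int, b: int) -> int:
    while b:
        a, b = b, a % b
    return a

print(gcd(48, 18))  # → 6
```

No secret sauce, no caveat – which, conveniently, is the next point.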
For complexity to overtake simplicity – to be more “beautiful” – I would have to see something incredible. Some sort of reality-inverting, time-reversing higgledy-piggledy witchcraft from good old complexity. Something which only complexity could bring, that would be able to challenge all the hours saved, all the elegance, and all the ease of simplicity.
So, imagine my astonishment when I learnt that there was no secret sauce. No caveat.
And maybe that’s what gives it beauty. We can’t live our lives trapped in this bubble of contrived simplifications. I mean, we’ll strive to do it. Our science syllabi are packed chock-full of abstractions, dumbed-down models, abridged definitions – the very basics. None of that messy calculus stuff. None of the differential equations with no known solutions. Here, in physics, we explain things qualitatively!
I’m not saying it’s bad at all – there are students who just get tangled up in huge messes of atomic orbitals and organic reactions. It would be unreasonable – counterproductive even – to collapse the teetering tower they’ve spent so long building with a: “hey, remember that atomic orbital stuff? It’s all fake, we use molecular orbitals now”.
But I think sometimes we need this complexity. Sometimes, we need more accurate models of our universe to answer the messy problems – messy problems that just can’t be ignored. Sometimes, we ought to teach the students who are just a bit too comfortable about numerically approximating quantitative solutions, or about orbital hybridisation, or about Freud’s Eros-Thanatos complex (but actually properly this time).
The more complicated something is, the harder it is to understand. The more we suffer trying to study it. The more we struggle.
But crucially – the more complicated something is, the more we grow by engaging with it head on.
My past self might have subconsciously known that as he finished the concluding paragraph of an essay well over the word limit. A complicated essay intricately picking through the simplest words, arranged in the simplest way, about the simplest ideas. And of course I probably got 40% on that task as well as an angry email from a worn-out English teacher asking me to redo it.
But the beauty of that essay might just have been anything but its simplicity.