Recent News & Articles
07/03/26
When systems become "Us": Engagement, Alignment, and Shared Meaning.
An interdisciplinary look at connection, from quantum correlations to conversational alignment
In physics, some of the most beautiful ideas are not about objects, but about relationships. Quantum entanglement is one of them. When two particles interact in the right way, they can end up sharing a single joint state. Later, even when separated by an enormous distance, measurements on one can be correlated with measurements on the other (Horodecki et al., 2009). Bell-type experiments have repeatedly observed these distinctive quantum correlations (Aspect et al., 1982; Hensen et al., 2015).
A simple way to picture this is with a paired state often written like this:
|Ψ⟩ = (|00⟩ + |11⟩) / √2
where |Ψ⟩ is the joint quantum state, |00⟩ and |11⟩ are the correlated outcome pairs, and the factor 1/√2 normalises the state.
It is not the symbols that matter, but what they stand for: the pair is best described together, not as two independent stories running in parallel. Importantly, entanglement does not allow faster-than-light signalling, even though the correlations themselves can span large distances (Peres & Terno, 2004). And outside carefully controlled conditions, entanglement is typically fragile: interaction with the environment tends to wash out quantum coherence, a central point in decoherence theory (Zurek, 2003).
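For the curious, the correlation hidden in that little formula can be checked with nothing more than the Born rule: square each amplitude to get an outcome probability. Here is a minimal sketch in plain Python (an illustrative encoding of the state, not a quantum simulation):

```python
from math import sqrt

# Amplitudes of the paired state |Psi> = (|00> + |11>) / sqrt(2),
# written over the two-qubit basis |00>, |01>, |10>, |11>.
amplitudes = {"00": 1 / sqrt(2), "01": 0.0, "10": 0.0, "11": 1 / sqrt(2)}

# Born rule: the probability of a joint outcome is the squared
# magnitude of its amplitude.
probabilities = {outcome: a * a for outcome, a in amplitudes.items()}

for outcome, p in probabilities.items():
    print(f"P({outcome}) = {p:.2f}")
# Prints P(00) = 0.50, P(01) = 0.00, P(10) = 0.00, P(11) = 0.50:
# the mixed outcomes never occur, so learning one particle's result
# fixes the other's, yet each particle alone looks like a fair coin.
```

Note the two features the article stresses: perfect correlation (only 00 and 11 ever appear) and no signalling (each particle's own statistics stay 50/50 regardless of what happens to the other).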
What does this have to do with linguistics? More than you might think. Let us shift from particles to people.
Linguistics also has its own kind of “joint state”, not in the quantum sense, but in the everyday mechanics of interaction. When we talk, we continually build common ground, the shared platform that lets meaning accumulate rather than restart from zero (Clark & Brennan, 1991). We also align our language as we go, often automatically, converging in word choice, syntax, rhythm, and framing (Pickering & Garrod, 2004). Sociolinguists have long noted accommodation, the subtle ways speakers adjust their speech towards one another in real time (Giles et al., 1973). Conversation analysts show how repair works, how we renegotiate meaning when something misfires, and how interaction stays resilient through tiny corrections and clarifications (Schegloff et al., 1977).
So interaction changes systems. In physics, contact can create correlations that persist beyond separation (Horodecki et al., 2009). In language, contact creates conventions and expectations that persist beyond the moment of speaking (Clark & Brennan, 1991; Pickering & Garrod, 2004). In both cases, the point is not a permanent tether, but a traceable pattern left by shared time and mutual adjustment.
Entanglement gives us correlation without message transfer (Peres & Terno, 2004). Linguistic alignment gives us understanding without conscious planning (Pickering & Garrod, 2004). Different sciences, different mechanisms, same lesson: when systems truly interact, they do not leave each other unchanged.
📖 Reference list:
Aspect, A., Dalibard, J., & Roger, G. (1982). Experimental test of Bell's inequalities using time-varying analyzers. Physical Review Letters, 49(25), 1804–1807.
Clark, H. H., & Brennan, S. E. (1991). Grounding in communication. In L. B. Resnick, J. M. Levine, & S. D. Teasley (Eds.), Perspectives on socially shared cognition (pp. 127–149). American Psychological Association.
Giles, H., Taylor, D. M., & Bourhis, R. (1973). Towards a theory of interpersonal accommodation through language: Some Canadian data. Language in Society, 2(2), 177–192.
Hensen, B., Bernien, H., Dréau, A. E., Reiserer, A., Kalb, N., Blok, M. S., Ruitenberg, J., Vermeulen, R. F. L., Schouten, R. N., Abellán, C., Amaya, W., Pruneri, V., Mitchell, M. W., Markham, M., Twitchen, D. J., Elkouss, D., Wehner, S., Taminiau, T. H., & Hanson, R. (2015). Loophole-free Bell inequality violation using electron spins separated by 1.3 kilometres. Nature, 526(7575), 682–686.
Horodecki, R., Horodecki, P., Horodecki, M., & Horodecki, K. (2009). Quantum entanglement. Reviews of Modern Physics, 81(2), 865–942.
Peres, A., & Terno, D. R. (2004). Quantum information and relativity theory. Reviews of Modern Physics, 76(1), 93–123.
Pickering, M. J., & Garrod, S. (2004). The interactive-alignment model: Developments and refinements. Behavioral and Brain Sciences, 27(2), 212–225.
Schegloff, E. A., Jefferson, G., & Sacks, H. (1977). The preference for self-correction in the organization of repair in conversation. Language, 53(2), 361–382.
Zurek, W. H. (2003). Decoherence, einselection, and the quantum origins of the classical. Reviews of Modern Physics, 75(3), 715–775.
16/04/25
Flyting: When Poets Fought with Fire and Flattery
Across the colourful history of linguistic duels, flyting stands tall as the medieval world's answer to the modern rap battle, except with fewer beats and far grander insults. Flourishing in Scotland and the Norse world, most famously at the Scottish court in the 15th and 16th centuries, flyting (from the Old English flītan, meaning "to quarrel") was a formalised ritual of poetic insult exchange in which two opponents duelled verbally in rhymed verse designed to embarrass, provoke, or outwit their rival, preferably all at once. But make no mistake: this was not drunken tavern banter. These battles were often held in royal courts and at public gatherings, judged by nobles and even monarchs. The goal was grand: to demonstrate one's mastery of rhetoric, rhythm, and invective. Winning a flyting could boost a poet's social status, while losing could mean public disgrace, a devastating fate in honour-bound societies where words were considered weapons.
One of the most famous recorded flytings is The Flyting of Dunbar and Kennedy (c.1500), in which William Dunbar and Walter Kennedy, two court poets of King James IV of Scotland, launch a barrage of insults at each other. Take this remark from Walter Kennedy to William Dunbar:
“Ignorant elf, ape, owl irregular,
Skaldit skaitbird and commoun skamelar,
Wanfukkit funling that Natour maid ane yrle...”
(Translation: You ignorant elf, ape, misshapen owl, mange-ridden scavenger and scrounger, misbegotten foundling whom Nature made a dwarf.) Brutal. And it only escalates from there. Read more here
From a linguistic standpoint, flyting is a gem of performative pragmatics: language not as description but as ritualised combat. Insults were delivered in structured metre, often rhymed couplets, demanding not just creativity but technical mastery. These exchanges also reveal fascinating patterns for historical sociolinguistics: coded references to class, regional dialects (Kennedy often mocks Dunbar's Lowland roots), and even early conceptions of gender and masculinity.
Remarkably, flyting also offered a sanctioned outlet for verbal aggression, functioning like a pressure valve in tightly knit honour cultures. And much like modern rap battles, it had its own social economy: win the crowd, win prestige. Lose, and, well… as Kennedy warned Dunbar:
“See sone thow mak my commissar amendis,
And let him lay sax leichis on thy lendis...”
(Translation: Have your deputy whip your backside six times or you’ll regret being born.)
In essence, flyting was the linguistic version of trial by combat, except with quills instead of swords, and a great deal more sass. So next time you hear a slick insult wrapped in rhyme, know that Scotland’s court poets were doing it centuries ago and probably better.
It’s also worth noting that women participated too, though less frequently. In Norse sagas, sharp-tongued female characters often engage in senna, a related form of poetic insult exchange with equal ferocity, proving that verbal sparring knew no gender bounds.
Scholars note that the flyting tradition likely fulfilled several sociocultural functions at once. As a performance, it was language doing something rather than simply conveying information: challenging, defending, humiliating, and entertaining. Socially, it maintained hierarchies, allowed for safe conflict resolution through verbal expression, and reinforced communal norms by showcasing what could and could not be said in public.
So, the next time you hear a modern diss track or a freestyle battle, remember: before Eminem, there were Dunbar and Kennedy, spitting rhymed venom in royal halls. Linguistic combat, it seems, is a tale as old as time, just with slightly more Middle Scots and fewer microphones.
06/09/24
Dear readers, as pleasures are greatest in anticipation, I would like to introduce my new novel which is to be published very soon.
Doubting Verity is very special to me for several reasons. Firstly, the novel raises academic issues that I, your humble servant, faced personally in my teaching career. Namely, the plot is based on a true story that took place not so long ago and greatly affected the life of my friend and colleague.
So often we hear about prosecutions where the offender is a person in a position of trust. In doing so, we teach society to see the matter in one way only and never to look at the other side of the coin, where the senior person can be the victim. This framing narrows our judgment and can lead to a massive oversight: the crime committed by the juvenile. Most of these situations evolve gradually, and both the adult offender and the juvenile one ensure that they mimic a normal courtship or natural affection. It is often not until later that the minor realizes his or her naivety was exploited, or the senior discovers they have underestimated the cunning and ingenuity of the underage party. This book contemplates the consequences of underestimating the actions of a juvenile in an academic setting. The plot follows a prominent young professor of English Literature, Michael Elliot, who runs an international summer programme.
Secondly, I named one of the leading characters after myself. If you have not yet managed to decipher my poem from the author's tab, that name, chosen on no eccentric whim, is Emma. Statistically, the name has been on the top-ten list of most popular names in the UK and Europe since, well, ever, so at least one of my works was inevitably doomed to feature an Emma. And now that I am done with all those Isabelas, Corinnes and Ashleys, I have earned the right, haven't I? Since Emma's character depicts my real-life role in the story the book tells, she bears many of my personal features, works my job and has my friends. She does, however, have a less dramatic surname than mine (seriously, the characters have enough troubles along the plot; I simply couldn't torture them with surnames as well).
Well, I hope this teaser is intriguing enough to boost your anticipation. The video teaser is also on the way and, oh, you will love it, I promise! I believe that is enough revelation for today. And remember: the world is full of magical things patiently waiting for us to discover them, as soon as they get published.
04/01/22
Why is "losing your head" no longer an idiom?
Why does the ignorant one pull through? The origin of the human brain is one of the main mysteries of evolution and one of the most controversial topics in biological science. Why did evolution, at some point in time, choose to support brain development in one primate species? Why did the brain evolve so rapidly in such a short period? And why have Homo sapiens' brains been steadily losing weight for the past 30,000 years?
To answer these questions, we have to turn to the interesting metamorphoses that befell the ancestors of mankind millions of years ago. Before the advent of Homo sapiens, evolution proceeded in the traditional way. As we all know from our school Anatomy and Biology lessons, the "fuel" of evolution is polymorphism: variability within a single species. If the external conditions of habitation did not change, the characteristics of the species remained more or less conservative; if the conditions changed, polymorphism was the only way for those creatures to survive. A gene variant that happened to suit the changed conditions better spread through the population and improved the species' fitness. When the available genetic variation could not meet the demands of the new conditions, the population died out. Natural selection is the eternal opposition between a plurality of features and environmental pressure. The animals that managed to find food, survive the cold and procreate successfully lived on. The others became extinct.
The frontal lobe, which became the morphological basis of human intelligence, originally had the task of inhibiting animal instincts. The instincts themselves, the innate and automatic self-preserving behaviour patterns that ensure our survival and that of our species, are controlled by what scientists call the reptilian cortex, the oldest of the three brain systems we inherited from our ancestors. It sustains the elementary activities of animal survival, such as breathing, adequate rest and a beating heart; we are not required to consciously "think" about these activities. The reptilian cortex also houses the "startle centre", a mechanism that facilitates swift reactions to unexpected occurrences in our surroundings.
That panicked lurch you experience when a door slams shut, a looming silhouette in darkness, weird squeaking sounds somewhere in the house, or the heightened awareness you feel when a twig cracks in a nearby bush while out on an evening stroll are all examples of the reptilian cortex at work. When it comes to our interaction with others, the reptilian brain offers up only the most basic impulses: aggression, mating, and territorial defence. There is no great difference, in this sense, between a crocodile defending its spot along the river and a turf war between two urban gangs.
Only thanks to the work of the frontal lobe are we able to restrain the instinctual urge to grab the last piece of cake on the plate and kindly offer it to a child instead. The evolved frontal lobe underlies our ability to share, to refuse food, and thereby to maintain relationships within society. Now and then we hear stories about people so concerned about losing weight that they try to eat as little as possible, eventually developing the disease called anorexia. It is almost impossible to force a person with this disorder to eat, and modern medicine struggles to help. Interestingly, some 60 years ago, when medicine was not so humane, patients with anorexia underwent a crude surgery in which a scalpel severed the connections of the frontal lobe. After a while these patients regained their appetite and returned to normal life. Well, almost normal: control over the animal instinct, and its inhibition, were no longer in action, and the thought of sharing food would never visit their heads again.
Another function of the frontal lobe was the support of social connections among the ancient hominids. Those who were unable to share food were either eaten, beaten, or expelled. Therefore, in just a few million years, the frontal lobe of the human brain grew very quickly and eventually became the basis of mind.
Man is a genuine part of nature, and for a long time the evolution of the human brain followed the same biological patterns as that of the other primates. It did not go very fast, and the very appearance of primates about 65 million years ago cannot be considered some kind of pinnacle of evolution; it was nothing more than the adaptation of mammals to new environmental challenges. However, those were not the changes that could trigger the substantial development of the human brain. What, then, were the unusual conditions that arose and radically changed the course of evolution?
To explain the reason for these revolutionary transformations, scientists wrangle over different forms of the so-called speech-social-labour theories. Some say that once our ancestors developed the art of communication, their brains changed radically; others counter that many animal species are known to use sophisticated communication systems and advanced community structures, yet these have not led to the emergence of a large brain. So what happened?
Apparently, the archetype of the human brain was formed in a rather unique environment, as the result of a long biological process. At some point, about 15 million years ago, very favourable conditions for mammalian life developed in eastern Africa. There, in the subtropics, in half-flooded places and shallow flowing water bodies, tasty and nutritious prey such as invertebrates and fish prospered in huge quantities, and a no less significant group of predators led a fully satisfied life. Among the latter were our distant ancestors.
To picture the ease of hunting, we may look at Norway today, where during the herring spawning bears come in on their hind legs and, standing chest-deep in the water, scoop up the roe with their paws and eat until they are full. Similarly, our ancestors had only to enter the water and scoop lightly with their hands to gorge themselves. All this led to the formation of a group of species that practically dropped out of the selection system: why change, if the environmental conditions are close to perfect? However, as is well known, with an excess of food animals are interested in little except reproduction. The abundance of food thus intensified competition during reproduction and, as a result, set off the race for dominance.
One consequence of this situation was the development of speech, which apparently originated in that period. Speech could have arisen as a way of organizing joint actions, and perhaps began with simple sounds or singing, as among modern gibbons. A male could impress a female with real hunting success and abundant prey, which added to his attractiveness and increased the chances of passing on his genome to future generations. But with the art of speech, a male could simply tell the female about such feats and earn the same victor's laurels in her eyes without making any real effort. In the biological world, interaction follows one principle: the fewer the actions, the greater the biological result. Imitation of action with the help of speech therefore became an invaluable asset among archaic anthropoids. Speech became a profitable product and a basis of intense selection, as it allowed a faster reproductive result. In effect, speech emerged as a form of deception, and deception was effective then, as it still is today.
To read more please request through the contact form
25/11/21
Antinomy Of Truth/Lie
Great but frightening Universe
Recently, I've had some interesting conversations with a few of my university colleagues about the representation of science, as a field of study, in literature. There's a giddiness going around all the social channels and YouTube blogs, an outpouring of science love, the kind you get from watching TV science shows, the kind with stunning visuals that is, well, a wee bit simplistic. Most of us won't even understand what they are talking about, or perhaps won't bother to. It's all very positive, commendable, and perfectly reasonable. But it leaves me feeling a little sad.
You see, the thing is, it's relatively easy to focus on what we know, yet to me the wonder of the Universe, the awesomeness, is never greater than when we contemplate all that we don't know. And those are topics of so many YouTube videos, again with sparkling computerised effects and no substantial information.
It's true that when we take note of the impossibly tiny chip of time our entire species has inhabited, compared to the billions of years before and the untold billions ahead, we can feel refreshingly small. Or, if we contemplate the billions of trillions of other worlds that must exist across the observable universe, and the septillions more across the Universe we cannot even observe, we can grasp momentarily just how minuscule our daily existence is. But for me nothing compares to the perspective, the shock, or the excitement of being reminded of what we are not even able to imagine with our primitive human brains. If you want a small shake of your grey cells, here you go: some of the questions I desperately pondered while working on Antinomy of Truth/Lie.
We don't know why the Universe exists:
This is really quite sad and somehow ridiculous, and it could be grounds for doubting that the Universe knows what it is doing, or that there is some exalted, sacred reason for all of us. In terms of Physics, and Astrophysics in particular, although there are some very appealing, very promising theoretical frameworks that begin to answer the question, the simple truth is that we are not sure which of them might be right.
We don't know what dark matter and dark energy are:
Big problem for science indeed, but a huge opportunity for SciFi writers to unleash their potential. Normal matter, the stuff of you, me, planets, stars, and Subway sandwiches, amounts to only about 4.9% of the total matter-and-energy content of the universe. Another 26.8% is dark matter: we know it's there because on large cosmic scales stuff moves around faster than it should, and because the way galaxies spread themselves across space is consistent with the existence of vast amounts of slow-moving, gravitating "stuff" that never turns into stars or planets or anything, just stays as diffuse, invisible, incredibly antisocial particles. The remaining roughly 68% is dark energy, the even stranger ingredient that drives the accelerating expansion of space.
We don't know whether life exists anywhere else:
This question is not only close to my heart; it also sends shivers down the spine whenever we hear something like "absolutely alone". Here we are, sacred beings on a planet blooming with life, life that has been busy sculpting and re-sculpting the physical and chemical environment for much of the past four billion years.
We probably haven't really figured out the quantum world:
While it's true that our present mathematical framework of quantum mechanics can do wonders, from describing atoms and molecules to the bizarre nature of entanglement and qubits, that doesn't mean that we've nailed the case shut. Quite the contrary.
We don't understand our own biology:
It is not too radical to say this; after all, if we understood every detail of how we work, we would presumably be able to eliminate disease, eliminate it even at the stage of conception, and, more than that, adjust the characteristics of a new being.
We don't know how the Earth works:
Let's head back to a grander scale. No human, and no robot, has ever physically travelled deeper than about 40,000 feet into the Earth's crust; everything else is extrapolation and interpolation from remote sensing and clever physical analyses, at times with the help of scanners.
We can't prove or solve many of our own mathematical conjectures and problems:
Ouch, the pain of childhood for so many of us. Once, when a reader asked me whether any book had ever made me cry, I said it was my third-grade Maths textbook.
We don't know how to make an artificial intelligence:
I'm putting this here because it is an eternal problem, along with the group of concomitant issues it drags along: the morality of AI self-awareness and self-consciousness and, my favourite, how to teach an emotionless machine to love us primitive biological beings when our existence too closely resembles the life of vermin.
The verdict:
There's an awful lot we don't know, far more than just my examples here. But the point is not to get discouraged, because this ignorance is a wonderful thing. It's what drives science, and it's what makes the Universe truly awe-inspiring. So, if any of these questions bother you too, welcome to the world of Antinomy.
14/05/20
E M M A ?
In today's article, I would like to talk about the fresh adaptation of Jane Austen's cult novel Emma. Unfortunately, I will not be able to provide a comparative analysis with previous adaptations; perhaps I will do that later in a separate article, because, well, I have a lot to say about the 2020 version. I do not intend to offend anyone with my subjective opinion. I am writing the following simply because my heart aches with sadness: Jane Austen, one of my favourite writers, deserves a more serious treatment, in my opinion.
De Wilde, who made a name for herself as a rock photographer by shooting such stars as the Rolling Stones for publications including the New York Times, says she sought to bring the rock star spirit to the characters in her adaptation of Jane Austen's famous novel Emma.
De Wilde and screenwriter Eleanor Catton, known as the youngest person ever to win the Booker Prize (for The Luminaries), tried not to think about previous versions of Emma, or about the book itself, as they worked on the script. According to de Wilde, she wanted to bring out completely different aspects of Austen's work than previous adaptations did. Her main focus was the comedic side of the novel.
From the very beginning, this film will take you some time to recover from the first shock and to continue watching with a slight condescension towards the complete disregard for the rules of strict English society, the mismatched props, and of course the hairstyles of some of the characters.
There is also something I can only describe as startling buttock action. A despondent Mr Knightley is seen completely naked from the rear, right at the beginning. My favourite moment is his close-up attempt to tuck his shirt into his trousers without exposing his main asset to the big screen.
Luckily, these indiscretions happen at the very beginning, after which the movie keeps its full period costume sedately in place and looks more or less decent, except for the strange panic attacks of Mr Knightley and the erotic swallowing of strawberries by Emma.
Taylor-Joy plays an unfamiliar Emma, not the one you pictured while reading Jane Austen. Around the novel's first publication in 1816, Austen famously called Emma "a heroine whom no one but myself will much like". I do not think even she would have liked this Emma.
Taylor-Joy is particularly good at the legendary unpleasant moments, the outbursts of arrogance, annoyance, malice and sadism, in which she clearly resembles the evil rich kid she played in Cory Finley's recent thriller Thoroughbreds.
Sometimes the casting and production work well, sometimes not so well. The excellent actor Josh O'Connor is forced to play the pantomime role of Mr Elton, which is not really suitable for him. Maybe he would have brought something more interesting to the role of Frank Churchill.
Johnny Flynn is dubiously masculine as Mr Knightley, with prickly morals that conflict with something sensual. He shows no interest in Emma until he dances with her at one of the parties. And there, my dear friends, we can safely characterise the moment as: "He was suddenly hit by love."
However, the real revelation for me was Mia Goth as Harriet, a gawky, maladroit yet engaging and touching portrayal of a lonely and rather scared young woman who looks as if she has been crying herself to sleep. I applaud her.
Amplifying the foppish mannerisms of Elton and other characters, de Wilde also tried to make fun of the rigid class system, which Austen mocked with clever verbal duels, puzzles, and comic sketches. The director chose a different path, a simpler one.
Near the end of the film, in a moment that could have been pulled from a raunchy teen comedy, Emma and Knightley seem finally to be about to kiss, but she gets a nosebleed. De Wilde says that this scene took inspiration from her own experience of getting nosebleeds at the wrong time.
In general, the full stop in the stylised title Emma. is the only truly "period" thing in the entire work, despite the screenwriter's excellent, book-faithful dialogue and the excellent work of the film crew, who chose interesting angles; in principle, the film is bursting with dynamics.
I like the earlier film adaptations of 1996 and 2009, but it is a matter of taste, of course. Perhaps, for those who have not read the book, the film will look like a good comedy based on some classic novel.
We will wait for the next adaptation of Pride and Prejudice which has already been announced for release this year.
05/04/20
Books that will kill you
I am sure that you’ve heard about dangerous books that can negatively affect the reader. I believe the first association that flashes across your thoughts is Death Note, the famous manga by Tsugumi Ohba. Truly, the majority of such books that are said can cause death, we saw only in horror films or in popular anime.
And yet, in Uncle Sam's country, there does indeed exist a 19th-century book called Shadows from the Walls of Death: Facts and Conclusions that will kill anyone who touches it.
The truth about this deadly book that kills its readers scares even the most daring fans of the Death Note manga.
However, the book I am telling you about right now is a very real hard copy, four of them to be precise. Originally it was printed in a run of 100 copies, but almost all of them were destroyed once the deadly quality of the volume was revealed.
I believe you have already rightly guessed that this work is not available to the general public, given the danger it entails. All four copies of it are held at Michigan State University in East Lansing, Michigan.
To trace the history of this literary weapon, let me take you back to the year 1874, when it was published by Dr Robert M. Kedzie (1823–1902), a Civil War surgeon who later became a professor of Chemistry.
Well, Dr Kedzie, unlike Alfred, decided to try his hand at literature. In fact, Shadows from the Walls of Death contains 100 pages of wallpaper samples with descriptions, 86 of which are deadly because they are soaked in arsenic.
Dr Kedzie was among the first to discover that the poison spreads through the air and significantly increases the number of diseases and deaths in the country. To draw attention to toxic wallpaper, he decided to print this book.
After making the 100 copies, Dr Kedzie did not feel entirely well, despite the measures he had taken to protect himself. Nevertheless, his desire to deliver the message to the public was too strong, and he donated the killer books to public libraries absolutely free of charge.
It must be said that the handling and storage of the remaining four copies of this deadly book have been carried out with great care. Initially, working with the copies required special gloves, masks and other protective equipment.
Another book on my list of deadly ones is Fahrenheit 451 by Ray Bradbury. Do not rush to throw away your copy: to spot the right one, we have to go all the way with the symbolism and pick up a copy of the 1953 limited edition, bound in asbestos to prevent it from burning.
And finally, here we are at the last, but probably the most lethal, book on the planet. We all remember those Chemistry classes where our teachers shocked us with Marie Curie's eccentric fascination with radium and polonium, to the point that she carried radioactive material around with her all the time.
This led to the ultimate conclusion: reading Marie Curie’s research notes can kill.
22/01/20
“Antinomy of truth”
The first book of the Antinomy book series.
Damien, a smart young French research fellow, presents findings that are too revolutionary for the current understanding of the laws of physics and mathematics. His works on the form of time and matter, as well as his diploma thesis on the role of zero-point vibrations in quantum technologies, open the way for him to a dream job and world recognition. Little does he know that four hundred years later his works will cause an internecine confrontation between the races of our Universe. Why would he care? Will he ever be able to change this future?
16/11/14
“More haste, less speed”
The second book of the new Fantasy series "The Order of Supreme Power" is on Anabelle's desk now. This one will focus our attention on another magnificent Order warrior, the youngest of them all, Avite. Oh, the poor man has no clue what awaits him as he heads off to fulfil his duties. Haste always ends in lost quality. Or does it?
Popular
NO BRAINER, COULD YA COME AGAIN? - ALLIANCE OF MAGIC WORLDS
For a true daredevil Corinne, who always chose fun over boring history lessons...
NIGHT DAWN - DIABOLIC EDEN
British Museum historian and scientist Isabella is summoned to a national research team to analyse cryptic symbols seared into the ancient folio...