
Search Results


  • Can we build the Iron Man suit? | OmniSci Magazine

    Cinema to Reality: Can We Build the Iron Man Suit?
    By Manthila Ranatunga
    We see cool and fancy gadgets in movies every now and then. How can we bring them to reality? For this issue, we take a look at the Iron Man suit.
    Edited by Breana Galea, Ashleigh Hallinan & Tanya Kovacevic
    Issue 1: September 24, 2021
    Illustration by Gemma Van der Hurk
    Warning: Iron Man (2008) spoilers

    When Marvel Studios released Iron Man in 2008, it was all the rage among comic book fans, film geeks and engineers alike. The Iron Man suit is one of the coolest and most iconic gadgets in film history. A generation of mechatronics engineers were inspired after watching Tony Stark build the suit, myself included. Now we wonder whether we could build it with today's technology. So, the question remains: can we build the Iron Man suit? We are talking about the Mark III suit, the gold and hot-rod red one. Unfortunately, replicating the suit exactly is impossible; the laws of physics would not allow it. However, we can make some compromises and find some workarounds to build the suit's most defining systems.

    The Power Source

    We can all agree the most vital part of the suit is the power source. After all, it gave Mr Stark the idea for the suit. The suit is powered by an arc reactor, which is essentially a fusion reactor (1). These produce power using nuclear fusion, the same process that powers the sun and stars. We are talking about reactions between atoms, the building blocks of everything. Atoms contain a cluster of even smaller particles inside; collectively, these form the nucleus, which is where the "nuclear" in nuclear fusion comes from. Now, where are we going with this? Well, when nuclear fusion occurs, heat energy is produced (2). Nuclear fusion was chosen as the suit's power source because of the colossal amount of energy it produces: for a power source that must fit in the palm of your hand, fusion offers one of the highest energy densities available. Sounds too good to be true, right? Correct. To replicate the conditions required, a reactor would need to be heated to 150 million degrees Celsius (3) - 10 times hotter than the sun's core! Imagine that on your chest! Unsettling, to say the least.

    Mr Stark's arc reactor is self-sustaining and can power the suit for hours, or even days. But with today's technology, fusion reactors consume more energy than they produce (4). Consequently, recreating an arc reactor of the same size and energy output is currently impossible. Nevertheless, there are workarounds to create a partially functioning arc reactor. The Massachusetts Institute of Technology (MIT) has been working on a fusion reactor called the Alcator C-Mod for the past 20 years (5), with the goal of reducing its size while maintaining power output. Typical fusion reactors range from three to nine metres in diameter, but MIT has managed to shrink theirs to about one metre. If we assume a future reactor of this size could produce net-positive energy and be well insulated against its own heat (and ignore the many other engineering details beyond our scope), we could adapt something like the Alcator C-Mod into our own arc reactor. Instead of being placed on the chest, it can be a giant backpack!

    The Flight System

    Now, why do we need so much power? Well, the flight system consumes the bulk of it. In the movie, Iron Man flies using the repulsors on his gloves and boots. They are not gas turbines like jet engines. The suit does not carry fuel - how could it? It does not have any storage compartments. The fuel must come from outside the suit. Here is a hint: it is everywhere, yet invisible at the same time... Air! Helicopters fly by pushing air downwards with their rotors. This works according to Isaac Newton's third law, which states that for every action there is an equal and opposite reaction. By pushing air downwards, the helicopter goes upwards.
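    To get a feel for the numbers, here is a back-of-the-envelope hover calculation in the same spirit. The combined mass and the exhaust velocity below are invented placeholder figures, not numbers from the film or any real design:

    ```python
    # Rough hover estimate: thrust must balance weight.
    # All numbers below are illustrative assumptions, not specifications.

    g = 9.81                  # gravitational acceleration, m/s^2
    suit_and_pilot_kg = 200   # assumed combined mass of suit + pilot

    thrust_needed = suit_and_pilot_kg * g  # newtons required just to hover

    # Thrust from expelling air downwards: F = (mass flow rate) * (exhaust velocity)
    exhaust_velocity = 400    # m/s, assumed repulsor exhaust speed

    mass_flow = thrust_needed / exhaust_velocity  # kg of air per second

    print(f"Thrust to hover: {thrust_needed:.0f} N")
    print(f"Air expelled: {mass_flow:.2f} kg/s at {exhaust_velocity} m/s")
    ```

    Even under these generous assumptions, the suit must hurl several kilograms of air downwards every second through palm-sized openings.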
    Iron Man does not have a giant rotor, so how did he solve this? Get ready for another round of physics! Repulsors use muon beams to control flight as needed. Muons are particles smaller than atoms. They exist in the Earth's upper atmosphere (6), but can also be created at large research facilities. For now, let us assume Mr Stark has a way to produce them on his own; remember, he is a billionaire! The muon beams are ignited using plasma made by heating air. To produce this on demand, the suit draws power from the arc reactor to heat and suck in air. The repulsor beams are then created, ready for flight!

    Muons have a short lifespan - about two millionths of a second. In real life, muon storage is not a viable option; they must be generated on the spot.
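    Exponential decay makes the storage problem concrete. A quick sketch, using the muon's measured mean lifetime at rest (the one-millisecond storage time is an arbitrary example, and fast-moving muons live somewhat longer thanks to time dilation):

    ```python
    import math

    MUON_LIFETIME = 2.2e-6   # mean muon lifetime at rest, seconds

    def surviving_fraction(storage_time_s: float) -> float:
        """Fraction of muons left after storing them for storage_time_s."""
        return math.exp(-storage_time_s / MUON_LIFETIME)

    # Even a one-millisecond buffer loses essentially every muon:
    print(surviving_fraction(1e-3))   # ~e^-455, indistinguishable from zero
    ```

    Even granting generous time dilation, the losses remain ruinous, so generating muons on the spot is the only option.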
    Muon creation occurs in particle accelerators (7). These are long tubes that accelerate particles and smash them into each other at high speeds. You may have heard of the Large Hadron Collider in Switzerland, a particle accelerator that is 27 km long. Through efforts to miniaturise accelerators, researchers at the SLAC National Accelerator Laboratory have designed one only 30 centimetres in size (8). Ignoring some laws of physics and with a few billion dollars, we can fabricate this into our own repulsors. Keep in mind that the suit's hands and feet are smaller than 30 centimetres, so our gloves and boots will be longer and bulkier.

    The Future

    So there we have it - a semi-reasonable arc reactor and a flight system. Fun to explore the possibilities of current technology, right? But we must also consider the ethics of building such a deadly weapon. Yes - the Iron Man suit is a weapon. In the wrong hands, this technology would not be so exciting. Decades or even centuries from now, scientific breakthroughs may allow the replication of the suit. When that happens, we will need to contemplate the moral consequences of such an advancement. Here we have only examined two principal systems of the suit. The rest is up to you! Traverse your mind and create your own semi-realistic Iron Man suit. As we saw here, the Iron Man suit may not be far off from our time. Who knows what the future holds?

    References
    1, 3, 4. Trevor English, "How Does Iron Man's Arc Reactor Work?" Interesting Engineering. Published June 26, 2020. https://interestingengineering.com/how-does-iron-mans-arc-reactor-work
    2. Matthew Lanctot, "DOE Explains...Nuclear Fusion Reactions." U.S. Department of Energy. Accessed August 30, 2021. https://www.energy.gov/science/doe-explainsnuclear-fusion-reactions
    5. Earl Marmar, "Alcator C-Mod tokamak." Plasma Science and Fusion Center, Massachusetts Institute of Technology. Accessed August 31, 2021. https://www.psfc.mit.edu/research/topics/alcator-c-mod-tokamak
    6. Paul Kyberd, "How a 'muon accelerator' could unravel some of the universe's greatest mysteries." The Conversation. Published February 20, 2020. https://theconversation.com/how-a-muon-accelerator-could-unravel-some-of-the-universes-greatest-mysteries-131415
    7. Seiichi Yamamoto, "First images of muon beams." EurekAlert! Published February 3, 2021. https://www.eurekalert.org/news-releases/836969
    8. Tibi Puiu, "Particle accelerator only 30cm in size is hundred times faster than LHC." ZME Science. Published November 6, 2014. https://www.zmescience.com/science/physics/particle-accelerator-faster-lhc-5334/

  • Sick of lockdown? Let science explain... | OmniSci Magazine

    Sick of lockdown? Let science explain why.
    By Tanya Kovacevic
    Feeling like the ant under COVID's boot? Find out just why you are feeling so down, and how you can break free of the overflow of emotions.
    Edited by Sam Williams
    Issue 1: September 24, 2021
    Illustration by Quynh Anh Nguyen
    Trigger warning: This article mentions symptoms of mental illness. If at any point the content is distressing, please contact any of the support services listed at the end of the article.

    COVID-19: the greatest enemy of 2020 and 2021. Victoria has had six lockdowns in the hopes of disrupting the course of the virus, leaving many feeling tired and hopeless. The endless restrictions have tested our resilience beyond belief. As a result, many of us are sick of lockdown: we are tired, moody, and anxious, following months on end of being secluded in our homes. It seems we have all turned into little Snorlaxes. If this is sounding uncomfortably familiar, you are not alone. Psychologists have realised it is a common occurrence amongst many Australians. So why are our little octopus plushies showing their angry little faces? What can we do about it?

    Cue the entrance of 'lockdown fatigue': the psychological phenomenon describing a wide-reaching feeling of intense exhaustion, due to the long-term effects of COVID-19 (1). Speaking to your fellow students (and lecturers and staff), you might find that a common theme of working from home is too much time bingeing on Netflix. In other words, there is a shared lack of motivation and concentration. The Australian Psychological Society has likened these symptoms to the natural process of grieving - yes, you read that right: we are all grieving. The world that we once knew has been completely disrupted, with our daily freedoms and safety torn away from us. Lockdowns have introduced so many unfamiliar aspects into our lives, from regular tests to social distancing to travel restrictions. Where we once had the freedom to go to concerts or the footy, or to lie in the sand with the sun on our faces in Torquay, we are now confined within our own boring four walls. Combine this with missing our friends and family, worrying about the future, and inconsistent messages from politicians, and it is no surprise that we are currently witnessing a lockdown fatigue epidemic.

    Identifying lockdown fatigue can be extremely difficult, as most of the symptoms overlap with common mental illnesses, such as depression and anxiety (2). Racing thoughts and conflict with those close to you are early signs (3). A study of 243 Filipino students showed that headaches and body pain were also common amongst students attempting to balance the effects of lockdown with their education (4). The most frequent symptoms are perhaps the most observable: depressed mood, irritability, fear or anxiety about how this will all end, lack of motivation and/or concentration, inability to make choices, and, of course, feeling mentally and physically exhausted (5). You could even be having more nightmares (6), some being about the coronavirus-ad jingle. It's tiring just to read through that list.

    So many symptoms, but what causes them? Grief for the freedoms we have lost and stress about the future are messing with everyone at the moment. The high levels of stress mimic a post-traumatic stress response while we live through horrible lockdown moments again and again, kicking our sympathetic nervous system into overdrive (7).
    The sympathetic nervous system is responsible for all things fight-or-flight (or fight-flight-freeze, if you are a psychology nerd), releasing stress-related hormones such as cortisol and adrenaline. Stress over long periods of time, especially over 18 months, is undoubtedly going to take a toll - and that toll is seen in lockdown fatigue, as those levels of cortisol build up. The accumulation weakens the immune response, which is why you may be getting colds more often, and it also taps into the brain, altering mood, motivation levels, and the fear response (8). The body's resources are drained by constant worrying, and even more so the resources of the mind. With mental fatigue comes lethargy, preventing you from paying attention to those lectures that feel longer than Lord of the Rings: The Return of the King. It is a ripple effect: lethargy turns to apathy and stress, stress leads to frustration when the internet drops out for the 100th time during the lecture, frustration leads to further fatigue, to sadness... Everything has a cause and a consequence.

    There are ways to combat lockdown fatigue, so do not think that it is the end of the world, even though it may seem like it. One of the key symptoms of lockdown fatigue is an overflow of emotions. The rush of feelings (or lack thereof) can often cause distress on its own, so it is important to accept that there is nothing wrong with feeling the way you do (9). Analysing and criticising your emotions will do more harm than good, so try to be nice to yourself! Dr Luana Marques, a psychiatrist and associate professor at Harvard Medical School, reminds her students that "however you may be feeling is valid in its own right" (10). Take it easy. Learn to love yourself.

    Mindfulness is a commonly recommended method of staying in touch with your mind and body (11). Whether it is journaling, meditating, or yoga, any mindfulness activity can strengthen the prefrontal cortex - responsible for thought processes and self-control - increasing your resilience and your ability to pay attention to your surroundings (12). If you notice that you are beginning to be overwhelmed by your emotions, change your focus (13). Think about everything that you have achieved, as small as it may be. Perfected your sourdough? Amazing. Taught your dog some new tricks? Get that on TikTok. Made your bed this morning? Go you! It does not need to be something extravagant, like building a new spacecraft; any accomplishment is something to be proud of, no matter how small.

    Many of us are also missing social contact, so say hello to your neighbours or get on FaceTime with your friends. Maintaining relationships is fundamental to breaking through the overwhelming uncertainties and negative emotions that come with lockdowns (14, 15). Finally, as much as you may want to, avoid staying in bed the whole day. Staying in bed will only give those annoying thoughts a chance to come crashing down (16). Instead, go outside and see some natural light. Natural light will help maintain your circadian rhythm - the cycle which decides when you feel tired and when you are pumped with energy - and make you feel better (17). So go ahead. Make a routine and take back a little bit of control. Start doing downward dogs and turning into a pretzel. Get this bread.

    COVID-19 and lockdowns have found a way to disrupt so many aspects of our lives, but ultimately, we decide how we approach them, though we may need a little bit of help. Look out for yourself, and for your friends and family.
    The fact that you are resilient enough to still be here is testament to your strength. If you can live through this chaos, you can live through anything.

    If at any time you feel or have felt concerned about your mental well-being, please consult a GP or contact any of the following services: Suicide Call Back Service: 1300 659 467 or suicidecallbackservice.org.au; Lifeline: 13 11 14 or lifeline.org.au; Beyond Blue: 1300 22 4636 or beyondblue.org.au; MensLine Australia: 1300 78 99 78 or mensline.org.au; or the University's CAPS: 03 8344 6927 for an appointment, or 1300 219 459 for emergency support.

    References:
    1, 2, 5, 9, 14. Australian Psychological Society. Managing lockdown fatigue. Victoria: The Australian Psychological Society Limited, 2020.
    3, 10, 12. Marques, Luana, and Waldinger, Robert. "Overcoming Quarantine Fatigue." Massachusetts General Hospital. Published June 2, 2020. https://www.massgeneral.org/news/coronavirus/quarantine-fatigue
    4. Labrague, Leodoro J., and Ballad, Cherry Ann. "Lockdown fatigue among college students during the COVID-19 pandemic: Predictive roles of personal resilience, coping behaviors, and health." Perspectives in Psychiatric Care 57, no. 3 (Mar 2021): 2-6.
    6. Silva, Kristian. "Feeling tired during the COVID-19 pandemic? Here's how you can improve your energy and motivation levels." ABC News, September 9, 2020, 8:21 a.m. AEST. https://www.abc.net.au/news/2020-09-09/fatigue-during-covid-19-pandemic-how-to-lift-energy-motivation/12640002
    7. Victorian Institute of Forensic Mental Health. "Lockdown fatigue amid Lockdown 6.0." Published August 2021. https://www.forensicare.vic.gov.au/lockdown-fatigue-amid-lockdown-6-0/
    8, 15. Mayo Clinic. "Chronic stress puts your health at risk." Published July 2021. https://www.mayoclinic.org/healthy-lifestyle/stress-management/in-depth/stress/art-20046037
    11, 13. Beyond Blue. "Lockdown regrets? Focus on what you did do." Published 2020. https://coronavirus.beyondblue.org.au/managing-my-daily-life/coping-with-isolation-and-being-at-home/lockdown-regrets-focus-on-what-you-did-do.html

  • The Rise of The Planet of AI | OmniSci Magazine

    The Rise of The Planet of AI
    By Ashley Mamuko
    When discussing AI, our minds instinctively leap to fears of sentience and robotic uprising. But is our focus misplaced on the "inevitable" humanoid future when AI has already become ubiquitous and undetectable in our lives?
    Edited by Hamish Payne & Katherine Tweedie
    Issue 1: September 24, 2021
    Illustration by Aisyah Mohammad Sulhanuddin

    On August 19th 2021, Tesla announced a bold project on its AI Day. The company plans to introduce humanoid robots for consumer use. These machines are expected to perform basic, mundane household tasks and streamline easily into our everyday lives. With this new release, the future of AI seems to be closing in. No longer do we stand idle, expecting the inevitable humanoid-impacted future: the prototypes are expected to launch by 2022.

    It seems inevitable that our future would include AI. We have already familiarised ourselves with this emerging technology in the media we continue to enjoy. WALL-E, Blade Runner, The Terminator, and Ex Machina are only a few examples of the endless list of AI-related movies, spanning decades and detailing both our apprehension and our acceptance. Most of these movies portray the machines as sentient yet intrinsically evil, as they pursue human destruction. But to further understand the growing field of AI, it's important to first briefly introduce its history before turning to the concerns played up in Hollywood blockbusters.

    The first fundamental interpretations of artificial intelligence span a vast period of time. Its first acknowledgement may be attributed to the Catalan poet and theologian Ramon Llull, whose 1308 work Ars generalis ultima (The Ultimate General Art) advanced a paper-based mechanical process for creating new knowledge from combinations of concepts. Llull aimed to create a method of deducing logical religious and philosophical truths numerically. In 1642, French mathematician Blaise Pascal invented the first mechanical calculating machine, the first iteration of the modern calculator (1). The Pascaline, as it is now known, could only add or subtract values using a dial and spoke system (2). Though these two early ideas do not match our modern perception of what AI is, they laid the foundation for pushing logical processes beyond purely mechanical means, foreshadowing the use of machines to perform human cognitive functions.

    Not until the 1940s and early 1950s did we finally obtain the means for more complex data processing. With the introduction of computers, algorithms allowed information to be stored, computed, and produced in a far more streamlined way. In 1943, Warren McCulloch and Walter Pitts founded the idea of artificial neural networks in their paper "A Logical Calculus of the Ideas Immanent in Nervous Activity" (3). This presented the notion of computers behaving similarly to the human mind, and laid the groundwork for what we now call "deep learning". In 1950, Alan Turing proposed a test, the Turing Test (originally the Imitation Game), to assess a human's ability to differentiate between human and machine behaviour: participants were asked to identify whether the dialogue they were engaging with came from another person or a machine (4). Despite these breakthroughs, the term "artificial intelligence" itself was not coined until 1955, by John McCarthy.
    Later on, McCarthy, along with many other budding experts, would hold the famous 1956 Dartmouth College workshop (5). This meetup of a few scientists would later be pinpointed in history as the birth of the AI field. As the field continued to grow, more public concerns were raised alongside the boom of science fiction literature and film. The notorious 1968 movie 2001: A Space Odyssey so shaped public perception of the field that by the 1970s an "AI winter" had set in: very little notable progress was made, as fear dried up funding (6). Eventually, as algorithmic technology advanced, came the notable Deep Blue chess match: in May 1997, IBM's Deep Blue beat world champion chess superstar Garry Kasparov, an event quietly read by some as the beginning of a "decline in human society" at the hands of the machine.

    Fast forward to now: AI has advanced by leaps and bounds, achieving far more sophisticated algorithms and machine learning techniques. To further understand the uses of AI, I interviewed Dr Liz Sonenberg, a professor in the School of Computing and Information Systems at The University of Melbourne and Pro Vice-Chancellor (Research Infrastructure and Systems) in Chancellery Research and Enterprise. She is an expert in the field with a long research record. "Machine learning is simply a sophisticated algorithm to detect patterns in data sets that has a basis in statistics," she explains. Such algorithms now sit behind a variety of our daily tech encounters: AI is the driving force of Google Maps and navigation, as well as voice control. It can easily be found anywhere. "Just because these examples do not exhibit super intelligence, does not mean they are not useful," Dr Sonenberg explains.

    Dr Sonenberg suggests that the real problem with AI lies in its fairness. These "pattern generating algorithms" at times "learn from training sets not representative of the whole population, which can end up with biased answers." With a flawed training set, a flawed system is in place. This can be harmful to certain demographics and sway consumer habits. With AI-aided advice, the explanation behind outcomes and decisions is not provided either: algorithms can only mechanically produce an output, not explain it. With more high-stakes decisions entrusted to the reliability of AI, the issue of flawed algorithms becomes more pronounced. Not once in my interview with Dr Sonenberg was the fear of super-intelligence, robot uprisings, and the like brought up...

    Armed with this new-found knowledge of AI's current concerns, I conducted another interview with Dr Tim Miller, a Professor of Computer Science in the School of Computing and Information Systems at The University of Melbourne, and Dr Jeannie Paterson, a Professor teaching subjects in law and emerging technologies in the School of Law at The University of Melbourne. Both are also Co-Directors of the Centre for Artificial Intelligence and Digital Ethics (CAIDE). As we began the interview, Dr Miller explained again that AI "is not magic" and rests on "math and statistics". Dr Paterson was clear to point out that anti-discrimination laws have long been in place, but as technology evolves and embeds itself further into the public domain, it must be scrutinised.
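    Dr Sonenberg's point about unrepresentative training sets is easy to make concrete. Below is a deliberately tiny, invented illustration (not from either interview): a "model" fitted to a skewed sample serves the under-represented group far worse.

    ```python
    # Toy demonstration of training-set bias (all numbers are invented).
    # A "model" that just learns the mean of its training data will
    # serve the under-represented group poorly.

    group_a = [4.8, 5.1, 5.0, 4.9, 5.2]   # well represented in training
    group_b = [8.9, 9.2, 9.0]             # barely represented

    training_set = group_a + group_b[:1]  # skewed sample: one B example

    model_estimate = sum(training_set) / len(training_set)

    def error(group):
        return sum(abs(x - model_estimate) for x in group) / len(group)

    print(f"Model estimate: {model_estimate:.2f}")
    print(f"Average error on group A: {error(group_a):.2f}")
    print(f"Average error on group B: {error(group_b):.2f}")  # far larger
    ```

    The "model" here is just a mean, but the same pattern appears in real systems: performance statistics dominated by the majority group can hide poor behaviour on everyone else.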
    The deployment of AI can easily cause harm because the systems are not public, making harms difficult to identify and causally attribute. And with biased algorithms comes a genuine dilemma. Dr Miller elaborated on the use of AI in medical imaging in private hospitals. As private hospitals tend to attract a certain echelon of society, the training set is not wholly representative of the greater population. "A dilemma occurs with racist algorithms... if it is not used [outcomes] could be worse."

    When the idea of a potential super-intelligent robot emerging in the future was brought into the conversation, the two didn't seem very impressed. "Don't attribute superhuman qualities [to it]," says Dr Paterson. Dr Miller states that the trajectory of AI's future is difficult to map. Past predictions of how AI's abilities would progress have come true, but much later than expected... easily decades later. The idea of super-intelligence also poses the question of how to define intelligence. "Intelligence is multidimensional, it has its limits," says Dr Miller. In this mystical future world of AI, a distinction is placed not just on "what will machines be able to do but what [we] will not have them do," states Dr Miller. "This regards anything that requires social interaction, creativity and leadership"; so the future is aided by AI, not dictated by it.

    In the nearer future, however, some very real concerns are posed: job security, influence on consumer habits, transparency, legal approaches, and accountability, to name a few. With more and more jobs being replaced by machines, every industry is at stake. "Anything repetitive can be automated," says Dr Miller. But this is not inherently a negative, as more jobs will be created to further aid the use of AI. And not all functions of a job can be replaced by AI. Dr Paterson explains with the example of radiology: AI is able to diagnose and interpret scans, but a radiologist does more than just diagnose and interpret on a daily basis. "The AI is used to aid the already existing profession, not simply overtake it."

    Greater transparency is needed in showing how AI uses our data. "It shouldn't be used to collect data unlimitedly," says Dr Paterson, "is it doing what's being promised, is it discriminating [against] people, is it embedding inequality?" With this in mind, Dr Paterson suggests that more legal authorities should be educated on how to approach topics regarding AI. "There needs [to be] better explanation... [We] need to educate judges and lawyers."

    With the notorious Facebook-Cambridge Analytica scandal of 2018, the big question of accountability was raised. The scandal involved the unwarranted use of data from 87 million Facebook users by Cambridge Analytica, which served to support the Trump campaign. It brought to light how our data can be exploited nonconsensually and used to influence our behaviours; this particular example seemed to sway the American presidential election. Simply put, our information can be easily exploited and sent off to data analytics firms to further influence our choices. This creates the defence that apps "merely provide a [service], but people use [these services] in that way," as Dr Miller put it. In effect, the blame becomes falsely shifted onto users for the spread of misinformation.
    The onus, however, should lie with social networking sites to give their users more transparency about how their data is used, as well as adequate protection of that data.

    To be frank, the future of robotic humanoid AI integrating seamlessly into human livelihoods will not occur within our lifetimes, or potentially even our grandchildren's. The forecast seems, at best, unpredictable and, at worst, unattainable, given the complexity of what constitutes full "sentience". However, this does not mean that AI lies dormant within our lives. The fundamental technology, grounded in computing, statistics, and information systems, lays the groundwork for most transactions we conduct online, whether monetary, social or otherwise. AI and its promises should not be shunted aside because of the misleading media surrounding its popularised definition and "robot uprisings", but rather taught more broadly to all audiences. So perhaps Elon Musk's fantastical ideas of robotic integration will not occur by 2022, but the presence of AI in modern technologies should not go unnoticed.

    References:
    1. Press, Gil. "A Very Short History of Artificial Intelligence (AI)." Forbes, 2016. https://www.forbes.com/sites/gilpress/2016/12/30/a-very-short-history-of-artificial-intelligence-ai/?sh=38106456fba2
    2. "Blaise Pascal Invents a Calculator: The Pascaline." Jeremy Norman's HistoryofInformation.com. n.d. https://www.historyofinformation.com/detail.php?id=382
    3, 4, 6. "History of Artificial Intelligence." Council of Europe. n.d. https://www.coe.int/en/web/artificial-intelligence/history-of-ai
    5. Smith, Chris, Brian McGuire, Ting Huang, and Gary Yang. "The History of Artificial Intelligence." Course paper, History of Computing, University of Washington, 2006. https://courses.cs.washington.edu/courses/csep590/06au/projects/history-ai.pdf

  • The Intellectual's False Dilemma | OmniSci Magazine

    The Intellectual's False Dilemma: Art vs Science
    By Natalie Cierpisz
    The age-old debate once again resurfaces. How do art and science truly interact? Is one dependent on the other? How does the 'art intellectual' embrace science, and how does the 'science intellectual' embrace art? Is this all a meaningless debate anyway?
    Edited by Andrew Lim, Mia Horsfall & Hamish Payne
    Issue 1: September 24, 2021
    Illustration by Casey Boswell

    The autumnal Melbourne wind whistles through the naked plane trees lining South Lawn; the sky is flat and grey. Two individuals who regard themselves, and only themselves, as 'intellectual paragons' are seated on a somewhat uncomfortable wooden bench, a perfect perch for people-watching, yet they are rather egotistical and notice only their own presence. One carefully places down their black coffee to light a hand-rolled cigarette; they are a liberal arts intellectual. As the wind grows stronger, the other tightly wraps a lab coat around themselves and pushes a pair of wire-rimmed spectacles up their nose for the nth time. This would be our scientist.

    "So, are you still fooling around with your test tubes and pretty lights?" asks the liberal arts academic, cigarette hanging out the corner of their mouth.

    "If you mean, am I still investigating antiprotons using laser spectroscopy, then yes, indubitably so. How's your fooling around with Hegel going?" replies the scientist, again pushing their glasses back up to a suitable height.

    The liberal arts intellectual is quick to rebut the scientist's trite remark - they are in fact composing a Hegelian analysis of The Communist Manifesto, and not 'fooling around' by any means. The tension between the two self-professed intellectuals is building; they appear to be fighting for dominance in their passive-aggressive attacks on each other's egos. So goes the age-old feud between the arts and the sciences. These two shallow characters play into the false dilemma that science and art are separate, distinct, alien. Two polar opposites. A total and unequivocal dichotomy. In all fairness, it is difficult to imagine many people taking so polarised a stance on the relationship between art and science. As we delve into the complex relationship between the two domains, it should become clear that science and art are functionally interdependent (1), and that considering them totally separate is simply absurd.

    Let's get back to our two feuding intellectuals. There seems to be much stereotypical disjunction between the two. But how does this translate to the true relationship between art and science? If the liberal arts intellectual and scientist were not so wrapped up in their self-interested ways, perhaps their gaze would slowly drift to the grandiose arches and imposing columns of the Old Quad. The harmonious form and mathematical ratios of these monuments are an enduring reminder of the architectural leaps and bounds made from the early 14th century, a blended pursuit of art and science. Ergo, we will head to one of the greatest paradigm shifts in Western history: the Renaissance.

    The Renaissance roughly spanned the 14th to the 17th century and was a period of complete intellectual revolution - for both science and the arts (2). Everyone is familiar with Leonardo da Vinci, the great Renaissance artist. Fewer people know that he was also an inventor, and a man whose artistic practice was heavily influenced by science (3).
    To ensure his paintings were as realistic as possible, da Vinci dissected cadavers to better understand human anatomy, and studied optics and astronomy to perfect his use of space and form in paintings like The Last Supper. Likewise, scientists like Nicolaus Copernicus and Galileo Galilei kickstarted a revolutionary paradigm shift towards the heliocentric model, their work in optics and astronomy being heavily reflected in artworks of the same era. Both science and art challenged what had for centuries been considered the status quo.

    Source: Leonardo da Vinci, The Last Supper, 1498, tempera on gesso, pitch, and mastic, 460 cm × 880 cm, Wikipedia, https://en.wikipedia.org (4).

    This certainly isn't a call for readers to head to the Melbourne General Cemetery and begin digging up specimens, nor to transfer to a double degree in fine arts and biomedicine. Instead, the point is about how fruitful interaction between the two domains can be, and how one requires the other to flourish. Returning briefly to South Lawn, the snarky liberal arts intellectual continues looking bored and takes out their copy of The Myth of Sisyphus. Sitting directly opposite, the scientist has gone back to finishing the latest New Scientist podcast and calculating a quantum theory of gravity. We have seen that science can inspire art, but how can art inspire science?

    "The greatest scientists are artists as well." (5) So said perhaps the most well-known scientist of the modern era. Not only did Albert Einstein develop the special and general theories of relativity (we won't get into the mathematical specifics, for both our sakes), he was also a talented violinist and pianist. Einstein often credited his artistic side for his success in science, testifying that "the theory of relativity occurred to me by intuition, and music is the driving force behind this intuition. My parents had me study the violin from the time I was six. My new discovery is the result of musical perception." (6) We have already seen how science prompts art to create new visions, and Einstein was no exception: his revolutionary ideas about space and time have been acknowledged as a prime artistic influence for Picasso's famed Cubist style, as well as for the Surrealist art movement (7).

    But the arts are not confined to visual and musical expression. How about the area of expertise of our liberal arts friend? Liberal arts, as they are known today, include sociology, literature, philosophy, psychology, politics, and more. The knowledge and, most importantly, critical thinking learnt through humanistic education is perhaps key to the future of science. As the world changes and evolves, humans must change and evolve with it, creating innovative solutions along the way. If we shift our focus to around the 1st century BCE, we will encounter what is widely regarded as the coining of the term artes liberales, or liberal arts. Roman statesman, scholar and writer Marcus Tullius Cicero wrote extensively about a wide array of topics, from politics and education to Stoic philosophy. "Artes liberales" roughly translates to "subjects worthy of a free person" - academic study that would enable one to actively participate in society (8). This curriculum focused on seven key disciplines: rhetoric, geometry, grammar, music, astronomy, arithmetic, and logic. Liberal arts by nature are not the antithesis of science.
    From the crux of the artes liberales evolved the study of mathematics, physics, philology, history, and so on. Today these seven disciplines have evolved and branched out so expansively that we have lost sight of the fact that our modern-day science and arts curriculums are sown from the same seed. Both science and art stem from the real world. Simply put, science is one lens into the study of this world and its inhabitants. Art is another lens into this complex system, providing a different but equally valuable perspective. Life is not binary, so neither should be our approach to studying it, and, by extension, to studying ourselves.

    Now is the time to embrace such transdisciplinary thinking. We need to bridge the gap between rigorous climate science and currently inadequate policy making, assess the ethics of the future of gene editing, and ultimately become better thinkers. Keeping humanity's head above water as our world rapidly changes will take the combined intellectual strength of the analytical thinking associated with science, where we learn how to test hypotheses, interpret data and draw valid conclusions, and of the arts, where we learn critical thinking, how to develop arguments, and how to understand a diverse audience.

    Take, for example, the future of the CRISPR-Cas9 editing tool. This enzyme-based tool allows scientists to remove or add sections of DNA sequence in our genome, our code for life. With this 'hand of God' comes great responsibility. Collaboration is needed between scientific thinkers and humanistic thinkers to identify what robust legislation must be implemented to ensure the ethical use of this tool. It is no longer a case of scientists working in isolation in underground bunkers: scientists are making huge strides in research that extend to and greatly impact the wider community. Cases like CRISPR-Cas9 demand a lens from science and a lens from the arts in order to see the full picture - and in this case, to ensure the ethical and safe practice of a tool that has the potential to save lives and improve individuals' quality of life - but this only happens if science and art function in harmony.
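    To make the cut-and-insert idea concrete, here is a purely illustrative toy sketch in Python. The sequence, the target site and the single-base change are all invented, and real Cas9 targeting (guide RNAs, PAM sites, repair pathways) is far more involved:

    ```python
    # Toy model of a CRISPR-style edit: cut a target site out of a DNA
    # string and splice in a replacement. All sequences here are invented.

    genome = "ATGGTACCTTGACGGATCCA"
    target = "CCTTGACG"          # site the guide RNA would recognise
    replacement = "CCTTCACG"     # repaired/edited sequence

    cut_at = genome.find(target)
    if cut_at == -1:
        raise ValueError("target site not found")

    edited = genome[:cut_at] + replacement + genome[cut_at + len(target):]
    print(edited)  # ATGGTACCTTCACGGATCCA
    ```

    Real genome editing must also contend with off-target matches elsewhere in the genome, which is exactly where the ethical questions the essay raises begin.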
    So back to you, the reader. Perhaps think about enrolling in that philosophy breadth subject next semester that your liberal arts friend raves about. Pick up that popular science book you have been eyeing off at Readings on Lygon St. Listen to that science podcast that keeps popping up on your Spotify homepage (the BBC's The Infinite Monkey Cage is excellent). Pick up that paintbrush. Go visit Science Gallery Melbourne, a recent art scene addition affiliated with the University of Melbourne - how fitting! This isn't Romeo and Juliet, where you are either a Capulet or a Montague. Rather, this is a case of wave-particle duality, where an electron is both a wave and a particle, and you are both an artist and a scientist.

    As the typical Melbourne wind continues to pick up and the Old Arts clocktower strikes 7:00 pm, it appears the liberal arts intellectual has just swapped their copy of The Myth of Sisyphus for the scientist's copy of Brief Answers to the Big Questions. Looks like they're making progress.

    References:
    1. Richmond, Sheldon. "The Interaction of Art and Science." Leonardo 17, no. 2 (1984): 81-86. https://www.jstor.org/stable/1574993
    2. History.com Editors. "Renaissance." History.com. April 4, 2018. https://www.history.com/topics/renaissance/renaissance
    3. Powers, Anna. "Why Art is Vital to the Study of Science." Forbes. July 31, 2020. https://www.forbes.com/sites/annapowers/2020/07/31/why-art-is-vital-to-the-study-of-science/?sh=7dfd8f8942eb
    4. Da Vinci, Leonardo. The Last Supper. 1498. Tempera on gesso, pitch, and mastic. 460 cm × 880 cm. Wikipedia. https://en.wikipedia.org
    5, 6. Root-Bernstein, Michelle. "Einstein On Creative Thinking: Music and the Intuitive Art of Scientific Imagination." Psychology Today. March 31, 2010. https://www.psychologytoday.com/au/blog/imagine/201003/einstein-creative-thinking-music-and-the-intuitive-art-scientific-imagination
    7. Muldoon, Ciara. "Did Picasso know about Einstein?" Physics World. November 1, 2002. https://physicsworld.com/a/did-picasso-know-about-einstein/
    8. Tempest, Kathryn. "Cicero's Artes Liberales and the Liberal Arts." Ciceroniana On Line 4, no. 2 (2020): 479-500. https://doi.org/10.13135/2532-5353/5502

    Further reading:
    Feynman, Richard P. The Pleasure of Finding Things Out: The Best Short Works of Richard P. Feynman. New York: Basic Books, 2005.
    Science Gallery Melbourne. "Inspiring and Transforming Curious Minds." Published 2021. https://melbourne.sciencegallery.com/what-we-do
    White, Fiona. "Why art and science are better together." The University of Sydney News. September 17, 2020. https://www.sydney.edu.au/science/news-and-events/2020/09/17/arts-and-science-better-together.html

  • Our Microbial Frenemies | OmniSci Magazine

    Our Microbial Frenemies
    By Wei Han Chong
    How could it be that some of the smallest organisms known to mankind hold so much influence and cause such calamity in our lives? The significance of these microorganisms has long eluded the greatest microbiologists. But has our perception of these microbes blinded us to their advantages, if any?
    Edited by Khoa Anh Tran & Tanya Kovacevic
    Issue 1: September 24, 2021
    Illustration by Rachel Ko

    Throughout human history, diseases and plagues have amassed death tolls reaching hundreds of millions, if not billions: from the Black Death in the 14th century, which killed about 200 million people, or about 30-50% of Europe's population, to outbreaks of tuberculosis and typhoid fever, resulting in 1.4 million and 200,000 deaths every year, respectively (1, 2, 3). It should come as no surprise, then, that we have long perceived these microorganisms as a threat to public health and have consequently sought to eradicate them from our environment. But have we been looking at them the wrong way?

    First and foremost, we know very little about the microorganisms living around us. For bacteria alone, some scientists have estimated around a billion species worldwide, and even this value is believed to be a gross underestimation (4). Before the germ theory, the most widely accepted theories of disease were spontaneous generation and miasma. Spontaneous generation was a simple theory which held that living organisms could develop from nonliving matter, such as maggots from rotting flesh. The miasma theory, on the other hand, was more prevalent throughout both ancient and modern history: "toxic" vapours from rotting organisms or unsanitary locations were believed to cause disease (5).

    This all changed with the germ theory of disease, an idea that would revolutionise our understanding of microorganisms for centuries to come. The Italian scholar Girolamo Fracastoro first theorised "invisible seeds" in 1546, believing that these seeds could cause disease when spread from infected to healthy individuals (6). For the most part, the germ theory would continue to follow this logic: a specific microorganism, a "germ", causes a specific disease when it invades its host (7). Yet it was not until some three centuries later that the field of microbiology would see huge developments. In 1861, French scientist Louis Pasteur disproved the spontaneous generation theory by means of sterilisation and proper sealing of food items, which would prevent microbial growth (8). Pasteur would not be the only one contributing to developments in microbiology, however. In 1884, German scientist Robert Koch became the first to develop a classification system for establishing a causative relationship between a microorganism and its respective disease, effectively confirming the germ theory of disease (9). (In their classic form, Koch's postulates require that the microbe be found in every case of the disease, be isolated and grown in pure culture, reproduce the disease when introduced into a healthy host, and be re-isolated from that host.) Even today, Koch's system is still very much influential in microbial pathogenesis, albeit refined to a higher standard: the modern version, known as Koch's Molecular Postulates (as opposed to his original postulates), places greater emphasis on the virulence genes causing disease rather than on the microorganism itself (10).

    Today, while we have much to thank Pasteur and Koch for in laying the foundation of modern microbiology, undoubtedly one of the biggest discoveries in the field was that of the human microbiota.
    When we think of microbial life, we usually think of diseases and plagues, cleanliness and dirtiness. Rarely do we consider the idea of microbes living inside and around us, and even less can we comprehend their sheer numbers. In our gastrointestinal tract alone, estimates suggest there are some 100 trillion microorganisms encoding three million genes altogether, 130 times more than we encode ourselves (11).

    Figure 1. Microbes in Food (27)

    So, what do we know about the microbiota - specifically, our microbiota? Firstly, we know that the microorganisms occupying our gut do not cause disease under normal circumstances. Secondly, we know that they can provide us with a multitude of benefits, such as helping us digest complex organic molecules, and preventing invasion by foreign microbes by competing directly for resources and keeping the immune system stimulated. These are just a few of the advantages our microbial allies provide. That is not to say they pose no danger to us, either. Typically, these microorganisms are categorised as being in a beneficial, pathogenic or commensal relationship with their host.

    Beneficial microbes, or probiotics, are as the name suggests: they typically provide some form of health benefit to the host and are usually non-pathogenic. Many of the bacterial species found in our gut lumen, for example, can digest cellulose; without these microbes, digesting vegetables would be a much harder and less rewarding task. Most of the probiotics found in our microflora are of lactic acid bacteria origin and are most common in diets that incorporate fermented dairy products (12). Pathogenic microbes, on the other hand, mostly describe microbes of foreign origin: these microorganisms will infect and exploit the host's cells, ultimately causing disease.

    Commensal microorganisms walk an interesting line in comparison to beneficial and pathogenic microbes. Depending on circumstance, this group can show all of the characteristics described above: benefiting both host and microbe, benefiting the microbe alone, or even causing disease within the host when given the opportunity. An example of a commensal microorganism is Escherichia coli, or E. coli. It is a bacterium that colonises our gastrointestinal tract as soon as we are born, where it fends off more than 500 competing bacterial species, thanks to its versatility and adaptations to our gut environment (13). Furthermore, the presence of E. coli along our gut epithelium helps stimulate mucin production, inhibiting foreign microbes from invading the epithelium (14). However, as is typical of a commensal organism, when given the chance E. coli is capable of causing intestinal or extraintestinal disease in our bodies. Urinary tract infections due to E. coli are among the most common microflora-associated infections and often occur when the bacterium enters the urinary tract via cross-contamination from the anus, where E. coli is typically shed as part of the faeces (15).

    Typically, these beneficial and commensal bacteria are found all over our body: in our hair, on our skin, and, as we have discussed, in our gut. Malassezia, for example, is a fungus that colonises our scalp, and is what causes dandruff in most people.
    While dandruff may be a nuisance to those who experience it, do the disadvantages necessarily outweigh the benefits? The presence of Malassezia on our scalps means that other, possibly dangerous, microorganisms have to compete with it in order to invade. Additionally, the stimulation of our body's defences by Malassezia aids in repelling foreign invaders (16). Staphylococcus aureus is another example of a commensal microbe, and an even better example of an opportunistic pathogen. It can be found living harmoniously on our skin and in our nasal passages, helping us fend off other competing microbes just as Malassezia does on our scalp. However, when the skin is pierced, whether by injury or medically through surgeries or treatments, the Staphylococcus bacteria will opportunistically attempt to invade and infect their host (17). As such, Staph infections and outbreaks are among the most common forms of hospital-related infections (18).

    Source: Thomas L Dawson, "What causes dandruff, and how do you get rid of it?" February 10, 2021, Ted-Ed video (19).

    Looking to the future, we have begun to see a spike in non-communicable diseases as opposed to microorganism-based diseases: most forms of heart disease, cancers, diabetes, and others. Still, while the rise of non-communicable diseases is arguably a cause for concern, the return of long-suppressed diseases and antibiotic-resistant pathogens may prove costly. Staph infections, as previously mentioned, are extremely common in hospital environments, where continued usage of antibiotics such as penicillin or methicillin has produced "super strains" of Staphylococcus that are resistant to most commercially available drugs (20). Currently, superbugs such as multidrug-resistant Mycobacterium tuberculosis and methicillin-resistant Staphylococcus aureus are most common in healthcare settings, but community transmission has become a concern (21). With our current practices of antibiotic over-prescription and continued reliance on sterilisation, future outbreaks of mutated and resistant pathogens may be inevitable.

    That being said, should we redefine what "clean and sterile" means to us? Should "sterile" necessarily mean a microbe-free environment? Ever since the inception of the germ theory, our stance on microbial life has been consistently "antibacterial", treating microbes as a threat to public health. The fact of the matter, however, is that these microorganisms are unavoidable. There are microorganisms living all over us: our fingers, our phones, even the soles of our shoes carry them. In hospital rooms, the composition of microbes is constantly changing as patients and visitors enter and leave (22). Besides, the composition of microbes in an environment is not determined solely by its occupants. Other factors, such as ventilation and even architecture, can determine what microbes we find there. In fact, hospital rooms with more airflow and humidity were found to suppress the growth of potential pathogens and to have fewer human-associated bacteria in their microbial composition (23). Just as the microbe composition of the environment can be shaped by architectural and building factors, the microbe composition of our microflora can hold incredible influence over our physiology.
    Dysbiosis, an imbalance in our microflora, can occur as a result of repeated consumption of antibiotics; it is a serious condition involving a significant loss of beneficial and commensal microbes (24). Consequently, the invasion and colonisation capabilities of foreign pathogens are increased, as has been shown in antibiotic-treated mice exposed to M. tuberculosis, where the dysbiotic state promoted pathogenic colonisation (25). Other factors, such as diet and lifestyle, also contribute as "disturbance" factors that influence dysbiosis, as can be seen with typical Western-style diets consisting mostly of fatty and sugary foods (26).

    While the threat of pandemics originating from drug-resistant superbugs looms over us, our understanding of microbial life has come far: from its humble beginnings as a rejected theory amongst scholars to the discovery of an extensive microbial ecosystem inside our guts. Despite that, our comprehension of this "hidden world" remains lacking, and we have yet to fully realise the potential of microbial life. Throughout history we have constantly taken an antimicrobial stance to preserve public health, but in recent times it has become increasingly clear that these microorganisms play a much greater role in health.

    References:
    1. LePan, Nicholas. "Visualizing the History of Pandemics." Visual Capitalist. Last modified September 2021. https://www.visualcapitalist.com/history-of-pandemics-deadliest/
    2. World Health Organization. "Tuberculosis." Published October 2020. https://www.who.int/news-room/fact-sheets/detail/tuberculosis
    3. Centers for Disease Control and Prevention. "Typhoid Fever and Paratyphoid Fever." Last modified March 2021. https://www.cdc.gov/typhoid-fever/health-professional.html
    4. Dykhuizen, Daniel. "Species Numbers in Bacteria." Supplement, Proceedings of the California Academy of Sciences 56, no. S6 (2005): 62-71. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3160642/
    5. Kannadan, Ajesh. "History of the Miasma Theory of Disease." ESSAI 16, no. 1 (2018): 41-43. https://dc.cod.edu/essai/vol16/iss1/18/
    6, 8. Greenwood, Michael. "History of Microbiology - Germ Theory and Immunity." News-Medical. Last modified May 2020. https://www.news-medical.net/life-sciences/History-of-Microbiology-e28093-Germ-Theory-and-Immunity.aspx
    7. Britannica. "Germ theory." Last modified April 2020. https://www.britannica.com/science/germ-theory
    9, 10. Gradmann, Christoph. "A spirit of scientific rigour: Koch's postulates in twentieth-century medicine." Microbes and Infection 16, no. 11 (2014): 885-892. https://doi.org/10.1016/j.micinf.2014.08.012
    11. Valdes, Ana M, Jens Walter, Eran Segal, and Tim D Spector. "Role of the gut microbiota in nutrition and health." BMJ 361, no. k2179 (2018): 36-44. https://doi.org/10.1136/bmj.k2179
    12, 24. Martín, Rebeca, Sylvie Miquel, Jonathan Ulmer, Noura Kechaou, Philippe Langella, and Luis G Bermúdez-Humarán. "Role of commensal and probiotic bacteria in human health: a focus on inflammatory bowel disease." Microbial Cell Factories 12, no. 71 (2013): 1-11. https://doi.org/10.1186/1475-2859-12-71
    13, 15. Leimbach, Andreas, Jörg Hacker, and Ulrich Dobrindt. "E. coli as an All-rounder: The Thin Line Between Commensalism and Pathogenicity." In Between Pathogenicity and Commensalism, edited by Ulrich Dobrindt, Jörg Hacker and Catharina Svanborg, 3-32. Springer: Berlin, 2013.
    14. Libertucci, Josie, and Vincent B Young. "The role of the microbiota in infectious diseases." Nature Microbiology 4, no. 1 (2019): 35-45. https://doi.org/10.1038/s41564-018-0278-4
    15. Harvard Medical School. "When urinary tract infections keep coming back." Published September 2019. https://www.health.harvard.edu/bladder-and-bowel/when-urinary-tract-infections-keep-coming-back
    16. Saunders, Charles W, Annika Scheynius, and Joseph Heitman. "Malassezia Fungi Are Specialized to Live on Skin and Associated with Dandruff, Eczema and Other Skin Diseases." PLoS Pathogens 8, no. 6 (2012): 1-4. https://doi.org/10.1371/journal.ppat.1002701
    17. Cogen, A. L., V. Nizet, and R. L. Gallo. "Skin microbiota: a source of disease or defence?" British Journal of Dermatology 158, no. 3 (2008). https://doi.org/10.1111/j.1365-2133.2008.08437.x
    18, 20. Klein, Eili, David L Smith, and Ramanan Laxminarayan. "Hospitalizations and Deaths Caused by Methicillin-Resistant Staphylococcus aureus, United States, 1999-2005." Emerging Infectious Diseases 13, no. 12 (2007): 1840-1846. https://doi.org/10.3201/eid1312.070629
    19. Dawson, Thomas L. "What causes dandruff, and how do you get rid of it?" February 10, 2021. Ted-Ed video, 5:04. https://youtu.be/x6DUOokXZAo
    21. Better Health Channel. "Antibiotic resistant bacteria." Last modified March 2017. https://www.betterhealth.vic.gov.au/health/conditionsandtreatments/antibiotic-resistant-bacteria#bhc-content
    22, 23. Arnold, Carrie. "Rethinking Sterile: The Hospital Microbiome." Environmental Health Perspectives 122, no. 7 (2014): A182-A187. https://doi.org/10.1289/ehp.122-A182
    25. Khan, Rabia, Fernanda C Petersen, and Sudhanshu Shekhar. "Commensal Bacteria: An Emerging Player in Defense Against Respiratory Pathogens." Frontiers in Immunology 10, no. 1 (2019): 1203-1211. https://doi.org/10.3389/fimmu.2019.01203
    26. Schippa, Serena, and Maria P Conte. "Dysbiotic Events in Gut Microbiota: Impact on Human Health." Nutrients 6, no. 12 (2014): 5786-5805. https://doi.org/10.3390/nu6125786
    27. Sottek, Frank. Microbes in Food. c. 1904. The Tacoma Times, Tacoma. https://commons.wikimedia.org/wiki/File:Sottek_cartoon_about_microbes_in_food.jpg

  • ​Meet OmniSci Designer Aisyah Mohammad Sulhanuddin | OmniSci Magazine

    Meet OmniSci Designer & Committee Member Aisyah Mohammad Sulhanuddin
    Aisyah is a designer and Events Officer at OmniSci, in her final year of a Bachelor of Science in geography. For Issue 4: Mirage, she is contributing to social media and as an illustrator.
    Interviewed by Caitlin Kane

    What are you studying?
    I am studying the Bachelor of Science in geography, now in my final year.

    Do you have any advice for younger students?
    It's alright to not know what you're doing. But on the flipside, if you do feel you know what you're doing, be very aware that could change in the next few years. Always be open to new options.

    What first got you interested in science?
    When I was a kid, my parents encouraged me to ask questions about the world. I also had my own little book of inventions... if there was a problem somewhere, even if it was with the most outlandish invention, I would seek a way to solve that problem. That idea of being able to figure out how the world works is very fascinating to me.

    How did you get involved with OmniSci?
    During lockdown, I saw on the bulletin an expression of interest for a new magazine. I'd just entered uni, wanted to try everything and thought why not, it seems like such a great opportunity. And it is!

    What is your role at OmniSci?
    I've done a lot of graphic design and I'm going to return for this issue in that role. I've basically collaborated with writers to make art that looks good, goes with my style and can convey what they want to say in their article. I'm also on the committee for OmniSci, and have been since last year. Within that, I've put multiple hats on: I've enjoyed organising multiple events for the club, and helping out with social media. Social events have had a great turnout this year, which is awesome. A new year is always a new opportunity for more people to learn about the magazine.

    What is your favourite thing about contributing at OmniSci so far?
    I've really enjoyed the graphics side of things. I love creating and it's really awesome to be able to put art to something text-based. It's interpretation... You're bound by what the article says and what the science says, but there is freedom within to express something. I definitely enjoy being able to put my creativity into promotion [as a committee member]. Doing it in a way that's aesthetically pleasing - it matters to me when things look nice!

    Do you have any advice for people thinking of getting involved, especially more on the committee side?
    Yes - do it! Come and join... If you're interested, feel free to come along because no role should be too daunting for you, and there is always opportunity to make the role fit how you want; it's quite flexible.

    Can you give us a sneak peek of what you're working on this issue? If there's a lot to come, maybe you can just tell us where you're up to in the process.
    I'll be working on the design and looking forward to collaborating with the writer as to how to convey their article properly. In the future, I'm looking forward to being able to create more content for OmniSci - really looking forward to that.

    What do you like doing in your spare time (when you're not contributing at OmniSci)?
    A range of things - I like to read, edit photos, and do graphic design of random illustrations. I also crochet, do a bit of arts and crafts on the side, and take a whole lot of photos.

    Which chemical element would you name your firstborn child (or pet) after?
    Wait, let me pull up the periodic table! Let's see... Neon. Feels like a great name for a child or an animal.
Like calling your kid Jaz or Jet. It’s very snazzy!

Do you have anything else you’d like to share with the OmniSci community?
Keep an eye on our Facebook page! Keep in touch and always keep on communicating, consuming and learning more about science, because honestly, that’s how the world progresses.

  • ISSUES | OmniSci Magazine

    Issues

Check out previous issues of OmniSci Magazine!

Issue 7: Apex. Cover: Ingrid Sefton. 22 October 2024.
Issue 6: Elemental. Cover: Louise Cen. 28 May 2024.
Issue 5: Wicked. Cover: Aisyah Mohammad Sulhanuddin. 24 October 2023.
Issue 4: Mirage. Cover: Gemma van der Hurk. 1 July 2023.
Issue 3: Alien. Cover: Ravon Chew. 10 September 2022.
Summer Issue 2022: A Year in Science. Cover: Quynh Anh Nguyen. 23 March 2023.
Issue 2: Disorder. Cover: Janna Dingle. 10 December 2021.
Issue 1: Science is Everywhere. Cover: Cheryl Seah. 24 December 2021.

  • Behind the Scenes of COVID-19 | OmniSci Magazine

    Conversations in Science

Behind the Scenes of COVID-19 with Dr Julian Druce

By Zachary Holloway

What will our future with COVID-19 look like? How do we live with it? How could it have been managed better? In conversation with Dr Julian Druce, a renowned expert in the field of virology.

Edited by Caitlin Kane & Breana Galea
Issue 1: September 24, 2021
Illustration by Janna Dingle

An interview with Dr Julian Druce, head of the Virus Identification Laboratory at the Victorian Infectious Diseases Reference Laboratory.

Before the middle of 2021, it seemed Australia was finally seeing the back of the COVID-19 pandemic: case numbers were down, the vaccine rollout was gaining momentum and Victoria had defeated the Delta variant twice. Fast forward to today, and the outlook doesn’t appear as rosy. More than a year and a half after the pandemic began, it is still dominating headlines around the world. Like many in Australia, I still had questions about the state of the pandemic, our path out of it and how scientists behind the scenes were shaping our public health response. I sat down in conversation with Dr Julian Druce hoping to find some answers.

Zachary Holloway: What was the work you were conducting at the Victorian Infectious Diseases Reference Laboratory (VIDRL) before the COVID-19 pandemic?

Dr Julian Druce: VIDRL itself is a public health reference laboratory, with a large focus on virology. For virology there are four main labs: one is a big serology laboratory, which tests for antibodies, the footprints a virus leaves behind after your immune system has interrogated that pathogen. The other labs are more focused on direct detection of specific viruses: there’s an HIV-specific lab, a hepatitis-specific lab and then my lab, which focuses on all other viruses. These mostly use very specific PCR (polymerase chain reaction) tests to detect a given virus. Another option for rapidly detecting viruses that might be new is to have tests that, rather than detecting a specific virus, detect a whole family of viruses at once. They’re called consensus PCRs or pan-viral PCRs. One of those tests was a pan-coronavirus PCR, which had been sitting in a freezer for thirteen years, only to be brought out at the start of 2020 when SARS-CoV-2 emerged. That was the test we used to verify that we had the virus, by sequencing the PCR product.
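To make the consensus-PCR idea concrete, here is a minimal sketch of our own (the sequences and primer are invented for illustration, not real assays): wildcard positions in the primer absorb the differences between family members, so a single stored test can flag a whole family of viruses.

```python
# Toy model of a pan-viral (consensus) PCR primer. 'N' marks a wildcard
# position where members of the virus family are allowed to differ.
# All sequences below are made up for illustration.

genomes = {
    "coronavirus A": "ATGGGTTGGGATTAT",
    "coronavirus B": "ATGGGATGGGATTAC",
    "coronavirus C": "ATGGGCTGGGATTAT",
}

consensus_primer = "ATGGGNTGGGATTAN"

def primer_matches(primer: str, sequence: str) -> bool:
    """True if the primer 'binds': every base matches or is a wildcard."""
    return all(p == "N" or p == s for p, s in zip(primer, sequence))

for name, genome in genomes.items():
    print(name, primer_matches(consensus_primer, genome))
# All three print True: one test tolerates the positions where family
# members differ, which is how an old assay can flag a brand-new relative.
```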
ZH: I know that VIDRL was the first lab outside of China to grow SARS-CoV-2 in culture. What was the process for this, and how did this help in developing a standardised test for COVID-19?

JD: My boss, Dr Mike Catton, and I had been on WHO [World Health Organisation] teleconference calls all through the preceding weeks, where everyone was clamouring for someone to grow the virus. So I immediately put it up for culture on the Friday night when we detected it. This process puts a small amount of patient sample onto cells that may become infected with the virus. I came in on Sunday to check it, thought something might be happening, and put the flask of cells onto a camera that took photos every fifteen minutes. As soon as I checked this on Monday, I knew the virus was growing, because there was an obvious pattern of change in the cells. In terms of having the cultured virus, it was then just a process of getting it out to other labs and collaborators. We gamma-irradiated some of the material, and that killed material was a good positive control for other laboratories to use to verify and validate their testing algorithms, because at that point there were only self-designed tests for COVID-19 in a few labs. This material was used to help validate labs around Melbourne and Australia as commercial tests became available, to get them ready for testing.

ZH: How important was genome sequencing for our contact tracers to be better able to track and trace the spread of the virus?

JD: In general, roughly every two weeks the virus will generate one mutation somewhere. That mutation can be used to track the lineage – a bit like a family tree – and once that mutation goes from, say, me to you, you might gain a new mutation when you pass it on to someone else. That mutation then becomes a key identifier for that strain. That really helped with tracking and tracing in the early days, in understanding who was probably giving it to whom, even though contact tracing can often work that out. Importantly though, at that very early stage we closed our borders to China but left them open to America and Europe. So as cases came in from those countries, we had to do genomic sequencing to verify which strain, or lineage if you like, with which key mutations, was showing up. We could then readily identify whether samples were from Europe, America or the Ruby Princess, or from wherever new cases were coming in.

ZH: Has the increased infectivity of the Delta variant of SARS-CoV-2 beaten contact tracers and made Australia’s “COVID zero” strategy unachievable?

JD: In terms of “COVID zero”, the national pandemic plan has always been to suppress the virus and flatten the curve. The public health aim is to push case numbers down and stretch them out along the timeline: you might end up with the same total, but spread across a year rather than one or two months, which would otherwise shatter your health system. But what we found early on was that, with a lot of goodwill and effort from the public, we did eliminate the virus. We didn’t necessarily expect to do that, so that was a lucky event. But the Delta variant does seem to spread more efficiently: the calculated reproduction rate for this variant is about 3-4 or more, compared with about 2-3 for the original wild-type. That makes it much harder to eliminate.
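As a rough aside from the editors (not part of the interview), the difference those reproduction numbers make compounds quickly. A minimal sketch, assuming a simple branching process with no immunity, interventions or randomness:

```python
# Toy branching-process comparison of the reproduction numbers quoted above.
# Illustrative only: real epidemics involve immunity, behaviour and chance.

def cases_in_generation(r: float, generation: int, seed: int = 10) -> float:
    """Expected new cases in a given generation of spread, from `seed` cases."""
    return seed * r ** generation

for r in (2.5, 3.5):  # roughly wild-type vs Delta, per the interview
    trajectory = [round(cases_in_generation(r, g)) for g in range(6)]
    print(f"R = {r}: {trajectory}")

# R = 2.5: [10, 25, 62, 156, 391, 977]
# R = 3.5: [10, 35, 122, 429, 1501, 5252]
# Five generations in, the higher-R variant has seeded over five times as
# many cases, so contact tracers must outpace a much steeper curve.
```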
ZH: I think millions of people around the country want to know the answer to this question, but when will lockdowns stop being a viable strategy for containing this virus? Does it come with increasing vaccination, or could it continue after that?

JD: It very much depends on what happens as we move forward. Of course, vaccination is the pathway out of this. As more people become vaccinated and less susceptible to serious disease and death, we will slowly transform this virus into a common cold, or at least that’s what is likely to happen. But I suspect that as we open up, if it all goes badly, we may have to keep some level of restrictions to mitigate transmission. Some of this is already being discussed with entry passports: people not being allowed into pubs, theatres, or wherever else there is close confinement in a natural or urban setting, unless they’re double-dosed.

ZH: In retrospect, how will we rate the response to this pandemic? Was it proportional to the dangers it posed?

JD: I think that will be debated for years. Every country has done it a little differently, from the worst end of the scale to the best. Australia is probably on the better end in terms of suppressing and eliminating the virus, but we haven’t done as well with the vaccine rollout. We’re getting there now – we’re catching up – but I think, generally, Australia will be viewed favourably as having had a good response. In Australia there’s a double-edged sword with vaccination uptake, because we didn’t have the carnage that other countries had. But now that we’ve got the virus circulating again, that has prompted a greater uptake of the vaccine, which is a good thing. Outside of Australia, I imagine the World Health Organisation will do an analysis of the generalised responses of different countries: from some of the poorer performers – like America and other countries that decided to let it rip, thinking herd immunity was the best option – to countries that suppressed and eliminated the virus, mainly through severe lockdowns. There are many parameters to look at, from economic and socioeconomic to virological and epidemiological: a lot of elements still to tease apart when this is all done.

Dr Julian Druce is the head of the Virus Identification Laboratory at the Victorian Infectious Diseases Reference Laboratory, where he works with a team to detect many of the viruses that infect humans and devises new ways to detect novel viruses. We would like to thank Dr Druce for taking the time to meet with us and discuss his work.

  • Bionics: Seeing into the Future | OmniSci Magazine

    Bionics: Seeing into the Future

By Joshua Nicholls

While the bionic eye might seem like a technology of the far future, exciting advancements are being made in the field of visual prostheses. This piece turns a keen eye to some of the most prominent diseases, along with their possible bionic treatments.

Issue 1: September 24, 2021
Illustration by Friday Kennedy

Visual prostheses, colloquially known as bionic eyes, are a set of experimental devices designed to restore — or partially restore — vision to those with varying levels of blindness (1). While once viewed as “science fiction”, these technologies are becoming a reality for thousands of Australians with visual impairments. Since Australian inventor Graham Tassicker first demonstrated the idea in 1956 (2), restoring vision using electronics has undergone several developments, ranging from rudimentary cortical stimulation to modern, state-of-the-art retinal implants.

As of 2018, it was estimated that over 13 million Australians have some form of visual impairment. Of these, 411,000 have cataracts, a clouding of the lens; 244,000 have macular degeneration, which degrades fine-detail vision; and 133,000 are either partially or entirely blind (3,4). The economic burden of blindness in Australia is substantial: in 2009, the total cost of vision loss per person aged 40 and over was estimated at $28,905 — a nationwide total of 16.6 billion AUD (5).

Figure 1: Categorisation of Total Economic Cost of Vision Loss in 2009 (5)

Age-related macular degeneration (AMD) is one condition for which visual prosthetics may be applicable. AMD refers to the irreversible loss of high-acuity, colour-sensitive cone cells in the centre of the field of vision. This region of the retina, the macula, is responsible for reading, recognising faces, driving and other visual tasks that require sharp focal vision. In fact, you are using these cells to read this article right now. Its typical onset is later in life, affecting 12% of people aged 80 or over (6). As the leading chronic eye condition for elderly Australians (7), it accounts for 48% of all cases of blindness nationwide (8). According to the AIHW (4), prevalence is also higher among females than males: 4.9–6.8% versus 3.6–5.1%, respectively.

Macular degeneration exists in two forms: dry and wet. Dry macular degeneration is caused by thinning of the macula; it is the most common form of the disease and progresses slowly over many years. Wet macular degeneration is a potentially more severe variant caused by the sudden development of leaky blood vessels around the macula (9). With no known cure — and most treatments directed towards prevention and delaying progression — interventions relying on prosthetics may be the best hope for restoring lost eyesight (10).

Graham Tassicker was the first to realise the potential of electrical stimulation for restoring sight to those with vision loss. In 1956, Tassicker developed a photosensitive selenium cell which, when placed behind the retina, produced phosphene visualisation — the phenomenon of seeing light without light actually entering the eye (2). This was the first evidence that non-cortical stimulation could elicit a visual experience.
It was in the 1990s that visual prostheses took a radical turn: sophisticated retinal surgeries and the creation of biomaterials led to a surge of novel inventions, including miniaturised cortical implants and artificial retinas — the latter of which is the most advanced to date.

One state-of-the-art retinal bionic system has recently undergone clinical trials: the Argus II Retinal Stimulation System. The Argus is an epiretinal (above the retina) implant designed by SecondSight; as of 2013 it was FDA-approved for retinitis pigmentosa (RP), but it has potential utility for dry AMD. It consists of a device implanted in the patient’s eye and an external processing unit worn by the user. The system has sixty electrodes, each 200 micrometres in diameter. Images captured by a small camera mounted on glasses are converted into electrical impulses that stimulate surviving ganglion cells on the retina. It is currently the most widely used retinal prosthetic system in the world, with more than 350 RP patients treated to date. The cost of this device is 150,000 USD — a price that excludes surgery and post-operative training (11).

Figure 2: The design of the Argus II (12)

In 2015, a case study was performed by the Argus II study group on the implant’s ability to restore visual function to subjects completely blinded by RP. The results were quite promising: all 30 patients who received the Argus II system performed significantly better on a white-square localisation test with the prosthesis than without it. (None of the subjects scored any points with the device absent.) The device also proved reliable: 29 of the subjects still had functioning implants after three years (13).

In 2020, a clinical trial of the device for dry AMD was completed. The study, which enrolled five patients, assessed the safety and feasibility of the device. According to Mills et al. (14), no patients reported confusion when operating the Argus alongside their healthy peripheral vision. Adverse events occurred in two patients, who experienced proliferative vitreoretinopathy, or tractional retinal detachment. However, owing to recent events surrounding the COVID-19 pandemic, the company announced that it would be performing “an orderly wind-down of the company’s operations”.

SecondSight is now focusing on a new device: the Orion. This device is designed to stimulate the visual cortex of the brain — a return to the original conception of visual prosthetics. The Orion is intended to expand the pool of patients eligible for visual prosthetics, as it bypasses the need for the healthy ganglion cells and functioning optic nerve that retinal prosthetics require. The only forms of blindness not addressed by this technique are congenital blindness and ‘cortical’ blindness caused by damage to visual cortex area V1. The Orion is modelled after the Argus II, with its 60 cortical-stimulating electrodes receiving input from a camera on the user’s glasses. Under the Breakthrough Device Pathway, the FDA approved the Orion for an early feasibility study. Six human subjects have been fitted with the device — one woman and five men between the ages of 29 and 57. Of these six, one had endophthalmitis, two had glaucoma, and three had suffered trauma.
After one year of wearing the device, four of the patients could accurately discern the location of a palm-sized white square on a computer screen, and five could track its movement in space. The Orion has shown a good safety profile after 12 months of use, and follow-ups on its progress will continue for five years (15).

Visual prostheses have a bright future of development ahead of them. While the field is still in its infancy, the results of ongoing clinical trials show promise for sight restoration. With multiple models and modes of intervention available, artificial vision is slowly becoming a reality for the visually impaired, though further developments are still required. The hope is to advance from coarse two-dimensional grey-scale images to the rich, three-dimensional, full-colour experience that we take for granted as normal vision.

For now, two essential factors need to improve for the full realisation of artificial vision: cost and electrode density. The Argus costs 150,000 USD — an expense that excludes surgery and training. This figure may be out of reach for the thousands of Australians who would benefit from such a device. If the current trend of Moore’s Law continues, electrode density will increase whilst the cost of the device decreases — a trend analogous to the growth in power and fall in price of computers over the last century. This electrode density will hopefully improve to the point of achieving near-normal visual acuity. The 60 pixels, while helpful in regaining some functionality, cannot compare to the some 96 million photoreceptor cells in the retina — 5 million of which are located in the cone-dense macula.
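To get a feel for that information gap, here is a small illustrative sketch of our own (not from the Argus literature), which crushes a camera frame down to 60 brightness values; the electrode layout is modelled as a 6 × 10 grid for simplicity.

```python
import numpy as np

def to_electrode_grid(image: np.ndarray, rows: int = 6, cols: int = 10) -> np.ndarray:
    """Average a greyscale image down to a rows x cols grid of 'electrode' levels.

    With 60 electrodes, every camera frame must be reduced to 60 intensity
    values before stimulation (layout modelled as 6 x 10 for illustration).
    """
    h, w = image.shape
    grid = np.zeros((rows, cols))
    for i in range(rows):
        for j in range(cols):
            patch = image[i * h // rows:(i + 1) * h // rows,
                          j * w // cols:(j + 1) * w // cols]
            grid[i, j] = patch.mean()  # one stimulation level per electrode
    return grid

frame = np.random.rand(480, 640)       # stand-in for a 307,200-pixel camera frame
print(to_electrode_grid(frame).shape)  # (6, 10): just 60 values survive
```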
Nevertheless, artificial vision is an exciting and innovative technology currently under development. While much research is still needed, further advancements in bionics may one day make visual prosthetics a ubiquitous and affordable technology for those in need.

About the writer: Joshua Nicholls was the 2021 winner of the Let’s Torque competition.

Joshua: I am a fifth-year neuroscience and biochemistry student at Swinburne University of Technology. I finished my Health Science degree a few years ago, majoring in neuroscience, and am now completing my final few subjects in my Bachelor of Science, with biochemistry as my major. For the state-wide Let’s Torque competition, I changed my pitch to artificial vision, hence its title, Bionics: Seeing into the Future—a catchy pun, if I do say so myself.

I made the rather complex topic of visual prosthetics approachable and understandable to a general audience by conveying a story. I asked my audience to consider losing their vision, if not completely then at least partially. I then asked them to imagine what life must be like for the some 13 million Australians who suffer from some form of visual impairment. This exercise brought home the very real phenomenon of visual impairment, by which many of us have been—or will be—impacted.

The solution for currently untreatable vision loss is already underway: the bionic eye, as it is colloquially known. While it may sound like science fiction, bionics (or prosthetics) are nothing new; artificial hearing through the cochlear implant and artificial limbs are becoming rather ubiquitous. I briefly detailed a few diseases for which visual prosthetics may be appropriate, such as age-related macular degeneration and retinitis pigmentosa, and spoke about past and current clinical trials demonstrating their efficacy. To end my pitch, I talked about the lasting impact these devices will have on people’s lives and the future developments required. In doing so, I relayed the past, present and future of the bionic eye, telling a coherent and relatable story to my audience.

I was successful in my pitch and won first place in the state! It was an absolute privilege even to have been part of this competition; coming first was an added honour and will remain one of the highlights of my life. I believe this experience will serve as a stepping stone toward a career in science and science communication. If anyone has any desire to get their foot in the door of this field, get your name and face out there and just go for it!

References:
1. Ong, J. M., & da Cruz, L. (2012). The bionic eye: a review. Clinical & Experimental Ophthalmology, 40(1), 6-17.
2. Tassicker, G. (1956). Preliminary report on a retinal stimulator. The British Journal of Physiological Optics, 13(2), 102-105.
3. Australian Bureau of Statistics. (2018). National Health Survey: First Results, 2017–18. Canberra: ABS. Retrieved from https://www.abs.gov.au/statistics/health/health-conditions-and-risks/national-health-survey-first-results/latest-release
4. Australian Institute of Health and Welfare. (2021). Eye health. Canberra: AIHW. Retrieved from https://www.aihw.gov.au/reports/eye-health/eye-health
5. Taylor, P., Bilgrami, A., & Pezzullo, L. (2010). Clear focus: The economic impact of vision loss in Australia in 2009. Vision 2020 Australia. Retrieved from https://www.vision2020australia.org.au/wp-content/uploads/2019/06/Access_Economics_Clear_Focus_Full_Report.pdf
6. Mehta, S. (2015). Age-related macular degeneration. Primary Care: Clinics in Office Practice, 42(3), 377-391.
7. Foreman, J., Xie, J., Keel, S., van Wijngaarden, P., Sandhu, S. S., Ang, G. S., . . . Taylor, H. R. (2017). The prevalence and causes of vision loss in Indigenous and non-Indigenous Australians: The National Eye Health Survey. Ophthalmology, 124(12), 1743-1752.
8. Taylor, H. R., Keeffe, J. E., Vu, H. T. V., Wang, J. J., Rochtchina, E., Mitchell, P., & Pezzullo, M. L. (2005). Vision loss in Australia. Medical Journal of Australia, 182(11), 565-568. doi:10.5694/j.1326-5377.2005.tb06815.x
9. Calabrese, A., Bernard, J.-B., Hoffart, L., Faure, G., Barouch, F., Conrath, J., & Castet, E. (2011). Wet versus dry age-related macular degeneration in patients with central field loss: different effects on maximum reading speed. Investigative Ophthalmology & Visual Science, 52(5), 2417-2424.
10. Cheung, L. K., & Eaton, A. (2013). Age-related macular degeneration. Pharmacotherapy: The Journal of Human Pharmacology and Drug Therapy, 33(8), 838-855.
11. Luo, Y. H.-L., & Da Cruz, L. (2016). The Argus® II retinal prosthesis system. Progress in Retinal and Eye Research, 50, 89-107.
12. SecondSight. (2021). SecondSight: Life in a New Light. Retrieved from https://secondsight.com/
13. Ho, A. C., Humayun, M. S., Dorn, J. D., Da Cruz, L., Dagnelie, G., Handa, J., . . . Hafezi, F. (2015). Long-term results from an epiretinal prosthesis to restore sight to the blind. Ophthalmology, 122(8), 1547-1554.
14. Mills, J., Jalil, A., & Stanga, P. (2017). Electronic retinal implants and artificial vision: journey and present. Eye, 31(10), 1383-1398.
15. Pouratian, N., Yoshor, D., & Greenberg, R. (2019). Orion Visual Cortical Prosthesis System Early Feasibility Study: Interim results. Paper presented at the American Academy of Ophthalmology Annual Meeting.

  • Silent conversations | OmniSci Magazine

    Chatter

Silent Conversations: How Trees Talk to One Another

By Lily McCann

There are so many conversations that go on beyond our hearing. This column explores communication between trees and how it might change the way we perceive them.

Edited by Ethan Newnham, Irene Lee & Niesha Baker
Issue 1: September 24, 2021
Illustration by Rachel Ko

It’s getting brighter. A long, long winter is receding and warm days are flooding in. I’m not one for sunbathing, but I love to lie in the backyard in the shade of the gums and gaze up into the branches. They seem to revel in the weather as much as I do, waving arms languidly in the light or holding still as if afraid to lose a single ray of sun. If there’s a breeze, you might just be able to hear them whispering to one another.

There’s a whole family of these gums in my backyard and each one is different. I can picture them as distinctly as the faces of people I love. One wears a thick, red coat of shaggy bark; another has pale, smooth skin; a third sheds its outer layer in long, stringy filaments that droop like scarves from its limbs. These different forms express distinct personalities. Gum trees make you feel there is more to them than just wood and leaves.

There’s a red gum in Central Victoria called the ‘Maternity Tree’. It’s incredible to look at. The huge trunk is hollowed out and forms a sort of alcove or belly, open to the sky. Generations of Dja Dja Wurrung women have sought shelter here when in labour. An arson attack recently blackened the trunk and lower branches, but the tree survived (1). Such trees have incredibly long, rich lives. Imagine all the things they would say, if they could only tell us their stories.

Whilst the ‘whispering’ of foliage in the wind may not have significance beyond its symbolism, there are other kinds of communication trees can harness. All we see when a breeze blows are branches and leaves swaying before it, but all the while a plethora of tiny molecules is pouring out from trees into the air. These compounds act like tiny, encrypted messages riding the wind, to be decoded by neighbours. They can carry warnings about unwanted visitors, or even coordinate group projects like flowering, so that trees can bloom in synchrony.

If we turn our gaze lower, we can see that more dialogue spreads below ground. Trees have their own telephone cable system (7), linking up members of the same and even different species. This system takes the form of fungal networks, which transfer nutrients and signals between trees (3). Unfortunately, subscription to this network isn’t free: fungi demand a sugar supply for their services. Overall, though, the relationship benefits both parties and allows for an effective form of underground communication in forests.
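As a playful aside from the editors (not the author’s model), this underground network can be pictured as a graph: each tree is a node, each shared fungal link an edge, and a warning hops outward from the tree under attack.

```python
from collections import deque

# Toy forest: trees as nodes, shared fungal connections as edges.
forest = {
    "red gum": ["smooth gum", "stringy gum"],
    "smooth gum": ["red gum", "wattle"],
    "stringy gum": ["red gum"],
    "wattle": ["smooth gum"],
}

def spread_warning(start: str) -> dict[str, int]:
    """Breadth-first spread of a distress signal; values count hops from the source."""
    hops = {start: 0}
    queue = deque([start])
    while queue:
        tree = queue.popleft()
        for neighbour in forest[tree]:
            if neighbour not in hops:
                hops[neighbour] = hops[tree] + 1
                queue.append(neighbour)
    return hops

print(spread_warning("red gum"))
# {'red gum': 0, 'smooth gum': 1, 'stringy gum': 1, 'wattle': 2}
```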
These conversations are not restricted to deep-rooted, leaf-bearing beings: trees are multilingual. A whole web of inter-species dialogue murmurs amongst the branches, beyond the grasp of our deaf ears. Through the language of scent, trees entice pollinators such as bees and birds to feed on their nectar and spread their pollen (4). They warn predators against attacking by releasing certain chemicals (5). They can even manipulate other species for their own defence: when attacked by wax scale insects, a persimmon tree calls up its own personal army by alerting ladybugs, which feed on the scales and avert the threat to the tree (6). Such relationships demonstrate the crucial role trees play in local ecosystems, and their essentially cooperative natures.

Trees can be very altruistic, especially when it comes to family members. Mother trees foster the growth of young ones by providing nutrients, and descendants support their elderly relatives - even the corpses of hewn-down trees - through their underground cable systems. These intimate, extensive connections between trees are not so different from our own societal networks. Do trees, too, have communities, family loyalties, friends? Can they express the qualities of love and trust required, in the human world, for such relationships?

This thought raises the question: can trees feel? They certainly have an emotional impact on us. I can sense it as I lie under the gums. Think about the last time you went hiking, sat in a tree’s shade or walked through a local park. There’s something about being amongst trees that calms and inspires. Science agrees: one study has shown that walking in forests is more beneficial to our health than walking through the city.

How do trees manage to have such a strong effect on us? Peter Wohlleben, German forester and author of The Hidden Life of Trees, suggests that happy trees may impart their mood to us (9). He contrasts the atmosphere around ‘unhappy’ trees in plantations, where threats abound and stress signals fill the air, with old forests where ecosystem relations are more stable and trees are healthier. We feel more relaxed and content in the latter environments.

The emotive capacity of trees is yet to be proven scientifically, but is it a reasonable claim? If we define happiness as the circulation of ‘good’ molecules such as growth hormones and sugars, and the absence of ‘bad’ ones like distress signals, then we may suggest that for trees, an abundance of good cues and a lack of warnings could be associated with a positive state. And this positive state - allowing trees to fulfil their day-to-day functions, grow and proliferate, and live in harmony with their environment - could be termed a kind of happiness in its own right.

This may seem like a stretch - after all, how can you feel happiness without a brain? But Baluska et al. suggest that trees have those too, or something like them: command centres in their roots, integrative hubs functioning somewhat like our own brains (10). Others compare a tree to an axon, a single nerve, conducting electrical signals along its length (11). Perhaps we could say that a forest, the aggregate of all these nerve connections, is a brain.

Whilst we can draw endless analogies between the two, trees and animals parted evolutionary ways some 1.5 billion years ago (12). Each developed its own ways of listening and responding to its environment. Who’s to say they haven’t both developed their own kinds of consciousness?

If we take the time to contemplate trees, we can see that they are infinitely more complex and sensitive than we might have imagined. They have their own modes of communicating with and reacting to their environment. The fact is, trees are storytellers. They send out a constant flow of information into the air, the soil, and the root and fungal systems that join them to their community. Even if we can’t converse with trees in the same way that we converse with each other, it’s worth listening in on their chatter. They could tell us about changes in climate, threats to their environment, and how we can best help these graceful beings and the world around them.

References:
1. Schubert, Shannon. “700yo Aboriginal Maternity Tree Set Alight in Victoria.” www.abc.net.au, August 8, 2021. https://www.abc.net.au/news/2021-08-08/dja-dja-wurrung-birthing-tree-set-on-fire/100359690.
2. Pichersky, Eran, and Jonathan Gershenzon. “The Formation and Function of Plant Volatiles: Perfumes for Pollinator Attraction and Defense.” Current Opinion in Plant Biology 5, no. 3 (June 2002): 237–43. https://doi.org/10.1016/s1369-5266(02)00251-0; Falik, Omer, Ishay Hoffmann, and Ariel Novoplansky. “Say It with Flowers.” Plant Signaling & Behavior 9, no. 4 (March 5, 2014): e28258. https://doi.org/10.4161/psb.28258.
3. Simard, Suzanne W., David A. Perry, Melanie D. Jones, David D. Myrold, Daniel M. Durall, and Randy Molina. “Net Transfer of Carbon between Ectomycorrhizal Tree Species in the Field.” Nature 388, no. 6642 (August 1997): 579–82. https://doi.org/10.1038/41557.
4. Buchmann, Stephen L., and Gary Paul Nabhan. The Forgotten Pollinators. Washington, D.C.: Island Press/Shearwater Books, 1997.
5. De Moraes, Consuelo M., Mark C. Mescher, and James H. Tumlinson. “Caterpillar-Induced Nocturnal Plant Volatiles Repel Conspecific Females.” Nature 410, no. 6828 (March 2001): 577–80. https://doi.org/10.1038/35069058.
6. Zhang, Yanfeng, Yingping Xie, Jiaoliang Xue, Guoliang Peng, and Xu Wang. “Effect of Volatile Emissions, Especially α-Pinene, from Persimmon Trees Infested by Japanese Wax Scales or Treated with Methyl Jasmonate on Recruitment of Ladybeetle Predators.” Environmental Entomology 38, no. 5 (October 1, 2009): 1439–45. https://doi.org/10.1603/022.038.0512.
7, 9. Wohlleben, Peter, Jane Billinghurst, Tim F. Flannery, Suzanne W. Simard, and David Suzuki Institute. The Hidden Life of Trees: The Illustrated Edition. Vancouver; Berkeley: David Suzuki Institute, 2018.
10. Baluška, František, Stefano Mancuso, Dieter Volkmann, and Peter Barlow. “The ‘Root-Brain’ Hypothesis of Charles and Francis Darwin.” Plant Signaling & Behavior 4, no. 12 (December 2009): 1121–27. https://doi.org/10.4161/psb.4.12.10574.
11. Hedrich, Rainer, Vicenta Salvador-Recatalà, and Ingo Dreyer. “Electrical Wiring and Long-Distance Plant Communication.” Trends in Plant Science 21, no. 5 (May 2016): 376–87. https://doi.org/10.1016/j.tplants.2016.01.016.
12. Wang, Daniel Y.-C., Sudhir Kumar, and S. Blair Hedges. “Divergence Time Estimates for the Early History of Animal Phyla and the Origin of Plants, Animals and Fungi.” Proceedings of the Royal Society of London. Series B: Biological Sciences 266, no. 1415 (January 22, 1999): 163–71. https://doi.org/10.1098/rspb.1999.0617.

  • The Ethics of Space Travel

    The Ethics of Space Travel

By Monica Blasioli

10 September 2022

Edited by Yvette Marris and Tanya Kovacevic
Illustrated by Aisyah Md Sulhanuddin

“That’s one small step for man, one giant leap for mankind.” Even without a name attached to that quote, people around the world will recognise it. The sentence alone can bring forth a flurry of emotions and thoughts - national pride, curiosity, nervousness, even scepticism - but most will recognise these as the first words spoken by Neil Armstrong, the first man to walk on the Moon, in July of 1969. Yet there are deeper considerations to discussing space travel than what first meets the eye. Just as on Earth, there are a number of health and environmental implications that should not be ignored in the flurry of excitement to explore the wonders of space. Not only are passenger safety and climate change areas of concern, particularly as space travel becomes constant and normalised, but so is the ethics of profiting from ventures that can inflict so much damage.

First and foremost, space exploration can foster communication and cooperation between countries. The National Aeronautics and Space Administration (NASA), an independent agency of the US federal government, works with countries such as Australia, Italy, Russia, France and Germany. NASA prides itself on international cooperation, celebrating its achievements in bringing together a global community of scientists to collaborate on space research and communication. And this is truly the reality! For over 64 years, NASA has successfully capitalised on the excitement surrounding space exploration, creating jobs across the globe (and in space) and sparking interest in science internationally through captivating space images, educational programs and videos, and even a clothing range at H&M!

In particular, collaborative work and research conducted at the International Space Station (ISS) has been a major benefit to humans. Despite not even being on Earth itself, it has deepened the understanding of our home planet. Research has revealed how the human body reacts to increased exposure to radiation, how plants grow in space (enabling a better awareness of how plants grow on Earth), and how chemicals and materials react to low-gravity environments. In fact, without space research, we wouldn’t be able to comprehend some things we take for granted on Earth - for example, how the Moon affects the tides and how long a day lasts (and also what your personality traits are, if you buy into that stuff).

However, there is always a dark side to the moon. The normalisation of space travel through its commercialisation could have devastating environmental impacts. On July 20, 2021, Amazon founder Jeff Bezos took off to space in his New Shepard rocket, built by his own company, Blue Origin. For ten minutes and ten seconds. Bezos and his company celebrated this moment as the beginning of their vision for a future where space travel, along with citizens living and working in space, is normalised - and, of course, commercialised by his company. While we congratulate Bezos and his team, can we really rejoice in his vision for the future knowing that the impacts for those back at home could be deadly?
A 2010 study using a global climate model found that 1000 launches of suborbital rockets each year would produce enough black carbon to change polar ozone levels by 6%, increase temperatures over the poles by one degree Celsius, and reduce polar sea ice by 5% (1). The vast amount of soot produced by rockets has the potential to further degrade the Earth’s atmosphere and, more worryingly, to begin altering outer layers that are currently untouched (2). These impacts make it difficult to justify Bezos’ plans to make paying for space travel a ‘norm’ in our lives. The precise impacts may be unknown, but Karen Rosenlof, a senior scientist at the Chemical Sciences Laboratory of the US National Oceanic and Atmospheric Administration, warns that releasing pollutants into places they have never been before never has positive outcomes (2). Bezos seems to show little concern about these effects, and rather more about profiting from the endeavour.

And this is only the beginning - the potential health disasters could be even worse. Just like Chris Pratt and Jennifer Lawrence in Passengers, we are not immune to a potential space-based disaster. For over 50 years, NASA’s Human Research Program (HRP) has been researching the impacts of space travel on humans - and trying to reduce those impacts on its astronauts. Many space radiation particles are deadlier than radiation on Earth and harder to shield against, increasing the risk of cancer and degenerative diseases such as cataracts (3). The usual radiation protections do not hold up, particularly when travelling further from Earth to a planet like Mars, where radiation exists at higher, deadlier levels (3). On a trip to Mars, travellers would encounter three different gravity fields, and would need to readjust to Earth’s gravity on returning (3). These transitions impair spatial orientation, coordination and balance, and cause acute space motion sickness, which can lead to chronic conditions (3).

All in all, this is still only the beginning of space travel and the research surrounding it. There are - quite literally - galaxies of information yet to be uncovered; humans don’t have all the answers. This reach for the stars may blind us to issues further down the line that still lack research: long-term exposure to radiation, prolonged consumption of dehydrated “space” food, changes in gravity, and how all of these interact cumulatively over time… the list goes on. Are further ventures into space worth the impacts on our world and our fellow humans alike? And all to further line pockets already filled with billions of dollars?

References:
1. Ross M, Mills M, Toohey D. Potential climate impact of black carbon emitted by rockets. Geophysical Research Letters. 2010 December 28;37(24):1-5.
2. Pultarova S. The rise of space tourism could affect Earth's climate in unforeseen ways, scientists worry [Internet]. 2021 July 26. Available from: https://www.space.com/environmental-impact-space-tourism-flights
3. Abadie L, Cranford N, Lloyd C, Shelhamer M, Turner J. The Human Body in Space; 2021 February 3 [updated 2022 February 24]. Available from: https://www.nasa.gov/hrp/bodyinspace/

OmniSci Magazine acknowledges the Traditional Owners and Custodians of the lands on which we live, work, and learn. We pay our respects to their Elders past and present.
