
Meet Sophia, the humanoid robot that has the world talking

Sophia the humanoid robot is sure to turn heads when she makes an appearance in Sydney this September.

 

How much longer until we get machines with human-level capabilities? Discussions about time-frames and consequences can occasionally get heated, and nobody knows any proper answers, but every now and then an expert will take a punt.

In February, Dr David Hanson, founder of Hanson Robotics, told the World Congress on Information Technology in Hyderabad that robots would be “alive and have full consciousness in five years”, according to India’s The Economic Times.

Hanson holds a PhD in interactive arts and engineering and is a former Disney sculptor and researcher at the company’s Imagineering Lab. His Hong Kong-based robotics firm is the maker of lifelike, humanoid robots, most famously ‘Sophia’, who has made appearances on talk shows and conference stages around the world.

Despite progress by Hanson and others, not everybody believes truly convincing humanoid robots will be here in the near-term future.

“Part of the challenge is the ‘uncanny valley’: robots that are even close to humans in appearance and behaviour look eerie,” said UNSW Professor of Artificial Intelligence Toby Walsh.

“But AI has made great advances in the past few years, driven by more computer power, more data and advances in algorithms like deep learning.”

Sophia’s dialogue uses a basic decision tree, like a chatbot, integrated with other AI features for tasks such as governing expression and emotion recognition. Last year, Hanson’s Chief Scientist, Ben Goertzel, told Humanity+ magazine that Apple’s Siri would probably be the nearest match to the company’s dialogue system, which also “seems to be a sort of complex decision graph, which on the back end can draw on a variety of different sources”.
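To make the idea concrete, a decision-tree dialogue system of the kind described can be sketched in a few lines. This is purely illustrative (the node names and replies are invented, not Hanson Robotics’ actual system): the bot walks one step down a tree of keyword-matched branches and emits the reply stored at the node it reaches.

```python
# Minimal decision-tree chatbot sketch (illustrative only; not Hanson Robotics' code).
# Each node holds a reply and a mapping of keywords to follow-up nodes.

class Node:
    def __init__(self, reply, branches=None):
        self.reply = reply              # what the bot says at this node
        self.branches = branches or {}  # keyword -> next Node

    def step(self, text):
        """Follow the first branch whose keyword appears in the input."""
        for keyword, child in self.branches.items():
            if keyword in text.lower():
                return child
        return None

# A toy dialogue graph with two branches
root = Node("Hello! Ask me about robots or feelings.", {
    "robot": Node("Robots are machines that sense, decide and act."),
    "feel":  Node("I can recognise emotions, but whether I feel them is debated."),
})

def chat(node, text):
    nxt = node.step(text)
    return nxt.reply if nxt else "I don't have a branch for that yet."
```

In a production system each node could also draw on back-end sources (search, memory, perception), which is what Goertzel’s “complex decision graph” description suggests.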

He acknowledged it is not artificial general intelligence (AGI) but told The Verge it is “absolutely cutting-edge in terms of dynamic integration of perception, action, and dialogue”.

Sophia is arguably an impressive feat of several different engineering disciplines, right down to her patented ‘Frubber’ emulating human skin, which is described as a “spongy elastomer using lipid-bilayer nanotech, self-assembled into humanlike cell walls”.

Whatever her advancements or shortcomings, Sophia and other androids made by Hanson Robotics are helping drive a necessary conversation about what should and shouldn’t be built, and why or why not. The ethics around recognising human qualities in robots were front and centre after Saudi Arabia, as a publicity stunt, granted citizenship to Sophia last October.

Sophia will be appearing at this year’s Australian Engineering Conference to discuss robot rights.

If a robot hurts or kills someone, is the robot responsible?

As robotics and AI become an increasingly potent force in society, previously abstract questions about how we should legislate them now need concrete answers.

 

As self-driving vehicles take to the roads, and organisations and governments continue to invest in collaborative robots to work autonomously alongside humans, we need to decide who should be responsible for the decisions that robots make.

An attempt to address this conundrum was made last year by the European Parliament, which passed a resolution suggesting robots be granted ‘legal status’. But recently, members of the European Council (responsible for defining the EU’s overall political agenda) and others have penned an open letter on the subject.

It strongly cautioned against granting robots legal rights, and suggested that proponents of legal status for robots might have ulterior motives for laying responsibility at the feet of machines, rather than their manufacturers.

What is the resolution?

 

The resolution was passed last year, when the European Parliament voted to grant legal status to ‘electronic persons’. Drafted by Mady Delvaux, MEP and Vice-Chair of the European Parliament’s legal affairs committee, the resolution aimed to create a set of unified laws to prepare European countries for the entry of AI and robotics into everyday activities, and to address concerns that autonomous machines might cause harm to their human counterparts.

Lawmakers called for legal recognition of robots as a way to hold them accountable for damage they might cause, particularly to clarify liability laws surrounding self-driving cars.

“At least the most sophisticated autonomous robots could be established as having the status of electronic persons with specific rights and obligations, including that of making good any damage they may cause, and applying electronic personality to cases where robots make smart, autonomous decisions or otherwise interact with third parties independently,” the resolution stated.

Proponents of the resolution have been quick to clarify that the legal status of robots would be similar to the laws that give businesses some of the legal rights of individuals, allowing them to sign contracts or be sued, but would not give them human rights.

Science fiction versus fact

 

The open letter, which has been signed by AI thought leaders, experts and CEOs around the world, raised a number of concerns its signatories have about the resolution.

Firstly, the resolution speculates that robots might have the autonomy to make complex choices and even make mistakes – an assumption that, the letter argues, drastically overestimates their abilities.

“From a technical perspective, this statement offers many biases based on an overvaluation of the actual capabilities of even the most advanced robots,” the letter stated, including “a superficial understanding of unpredictability and self-learning capacities, and a ‘robot’ perception distorted by science-fiction and a few recent sensational press announcements.”

Indeed, the European Parliament’s resolution begins with references to Mary Shelley’s Frankenstein, the myth of Pygmalion and other “androids with human features”.

Proponents of the Council’s open letter have also pointed to Sophia, the humanoid robot who was granted citizenship by Saudi Arabia.

Noel Sharkey, co-founder of the Foundation for Responsible Robotics and one of the letter’s signatories, expressed his concerns about the impact of ‘show robots’ like Sophia on law and policy makers.

“It’s very dangerous for lawmakers. They see this and they believe it, because they’re not engineers and there is no reason not to believe it,” he said in an interview with Politico.

An out for manufacturers?

 

Signatories of the letter suggested granting legal status to robots would ultimately serve manufacturers looking to absolve themselves from blame in the event of an accident.

“By adopting legal personhood, we are going to erase the responsibility of manufacturers,” said Nathalie Nevejans, a French law professor at the Université d’Artois and one of the letter’s architects.

However, while the letter opposes the resolution, it advocates for “unified, innovative and reliable laws” to regulate AI and robotics, especially as more semi-autonomous and autonomous robots are likely to hit the market in coming years.

Mikaela Dery

Mikaela is a staff writer and recent philosophy graduate. Her thesis looked at the ethical implications of AI and its potential as a force for good. She is now only a little bit scared that robots will take over the world.

AI for kids: New book hopes to get young readers interested in the future of tech

With robotics and AI changing our world at a rate of knots, kids need to get a handle on these concepts as soon as possible, according to Professor Michael Milford.

 

Milford, a professor of robotics from Queensland University of Technology, said early education in AI and automation is important, as these will be dominant factors affecting the employment and adult lives of today’s children.

With this in mind, he penned a kid-friendly book to spark early interest and understanding of the concepts, titled The Complete Guide to Artificial Intelligence for Kids, for which he recently ran a successful crowdfunding campaign.

“Nobody’s going to work in the same job or career for 50 years. The people who will prosper and flourish will be those who can learn new skills and adapt,” he said.

And in order to adapt to their brave new world, children need to be able to understand the technology that will drive it – and their cars – in the not too distant future.

Milford has raised enough funds to produce 1500 to 2000 copies of his guide, and will give some away to schools in low socio-economic areas.

“[These kids] will be the ones who are most affected by technological advances, so it’s important they are informed,” he said.

Learning how to learn

 

While coding is often mentioned as the must-have skill for the emerging job market, Milford told create digital he believes an awareness and general understanding of AI and its applications is more important, as intelligent machines might eventually write their own code.

If this sounds like a big ask, Milford draws a comparison with the basic knowledge that most people have about cars. It isn’t difficult to work out what’s wrong when you come to a stop after passing one fuel station too many, or if there is a loud bang and your ride suddenly gets bumpy.

To help kids get up to speed, Milford said the earlier they get started, the better. He has aimed his guide at a primary school audience, although he has also received positive comments from older readers.

“Lots of adults say they really like it. Some are starting from nothing with their knowledge in the field,” he said.

Milford has designed the illustrated guide to gently introduce the concepts of AI to children, making sure they have fun in the process.

He has also given the material a test drive with younger readers (including his own preschool-aged children).

“Older kids will pick up everything, while younger kids might understand simple concepts, such as robots having brains, or being able to help doctors,” Milford said.

To help kids get up to speed on the wide range of developing technology, Milford has also created other STEM-based story books and guides. His upcoming offering, Rachel Rocketeer, is about a young, female version of Elon Musk who tests rockets and sends them to Mars.

Milford plans to open his Kickstarter campaign for Rachel Rocketeer in the next couple of months.

Keeping up with the rest of the world

 

The government is working to improve STEM education in the wake of reports that place Australia’s science and maths school performance for primary students below the top third of OECD nations.

According to Milford, although Australian STEM performance isn’t getting worse, we are being overtaken by other countries. And that makes it harder for us to compete in the global landscape.

Milford said a cultural shift is needed, where pride and awareness of our technological and academic achievements is ingrained into the national psyche.

“In Boston, everyone in the city knows what the great achievements of MIT and Harvard are. In Australia, it’s a very different cultural environment.”

Milford said demystifying technological concepts for children early, through resources like his guide, will help with cultural change.

“If we can get guides like this into the hands of young kids, they’ll be aware of these things and that’s half the battle. And gentle, fun education on key concepts will help them prepare for detailed education later,” Milford said.

Professor Michael Milford.

Nadine Cranenburgh

Nadine Cranenburgh is an electrical engineer with postgraduate qualifications in environmental engineering, and professional writing and editing. She works as a freelance writer and editor specialising in complex topics that draw on her experience in the engineering, local government, defence and environment industries.

In the age of digital warfare, is Australia’s cybersecurity ready?

The newest combat domain is cyberspace, and Australia needs to do more to protect its digital borders and assets from threats.

 

As an arena of war, cyberspace is new, but the way war is executed in that arena is as old as war itself. There is attack, defence, deterrence and influence. The basics simply don’t change.

“These activities have been around since biblical times,” said Major General Marcus Thompson, an electrical engineer who now heads up Information Warfare Division in the Australian Defence Force (ADF).

“What’s new, of course, is the conduct of these activities in cyberspace, in this relatively new operating domain. So a lot of what we do is adapting existing military tactics, techniques and procedures to this new war-fighting domain, and to the new technology that we have available to us.”

He might be an engineer, but Thompson sees his role as one of translation and interpretation. As a professional military officer who has deployed on many overseas operations and who commanded at every level from Troop to Brigade, Thompson said he now has the responsibility of ensuring everybody in his team understands each other’s needs.

“My role is two-way, helping engineers understand the requirements of the combat force and helping the combat force understand the capabilities that engineers bring,” he said.

“The other part is frameworks and systems. This is a key part of an engineer’s skillset, bringing some order to what can otherwise be a very complex space. I’m forever encouraging the technical staff to make the complex simple. We need to express complex terms, complex techniques and complex technology in ways that the audience understands. That’s especially important in Canberra.”

Speaking of understanding, what exactly is ‘information warfare’? It is the integration of technical and non-technical capabilities in the information environment, Thompson says. Technical capabilities include cybersecurity and electronic warfare. The non-technical side includes such areas as intelligence and information operations.

“When I talk about integrating those capabilities, it’s the synchronisation, the integration and the co-ordination of technical and non-technical information capabilities, but also the integration of those capabilities with other kinetic and non-kinetic effects to achieve a specific outcome,” he said.

“That outcome could be strategic, it could be operational, or it could be tactical. That outcome might end up being the delivery of a weapon system or, in my language, a ‘loud, orange effect’. It might also be to achieve some influence, without the requirement to deliver a weapon.”

The recognition of cyber as a war-fighting domain doesn’t mean it is a domain unto itself, completely separate from the land, air, sea and space domains. In fact, Thompson said most fighting goes on at the intersection of those domains.

“As an Army officer, I’ll often take the mickey out of my Air Force and Navy friends by saying, ‘I’m really glad you’re here, as it would be a long swim without you.’ But it really is a team effort. Everything from bombs delivered by fast jets to naval gunfire support to intelligence that might come from a submarine or an aircraft, all of that comes together to create success.”

Defending our networks

 

While cyber as an avenue for attack is undeniably interesting, the defence of our systems is far more important, Thompson says. It’s a far greater challenge, so deserves greater time and effort. The act of resisting attack can be broken up into three main areas. The first is protecting yourself and fellow personnel.

“Self-defence is everyone’s responsibility. It is cultural, it is about awareness. It’s that, ‘Don’t be the person to click on the link in the phishing email’ piece, from a cyberspace operations perspective,” he said.

“What are you posting online? How do you keep yourself, your mates and your family safe in cyberspace?”

A non-military example is the recent news about the dangers of fitness apps such as Strava. It’s great that you do regular exercise, but do you really need to show the world exactly where you run and when you run?

The next area of defence is passive defence, the domain of network operators and communicators. This, he says, is where you make sure you have adopted all of the best-practice recommendations for your system.

“With these measures you might stop 95 per cent of attacks,” he said.

“Are your patches up to date? How many people have administrator rights? Are you monitoring your network or your mission system sufficiently closely that you notice anything unusual? As an electrical engineer, it’s what I would refer to as ‘basic network hygiene’.

“For us, it’s not just about computer networks. With modern weapon systems, even some of the scopes on sniper rifles have IP addresses! So we can talk about a ship or a plane or, goodness me, a rifle, all being part of the military internet of things. That’s why I talk about both network and ‘mission systems’.”
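The “basic network hygiene” questions Thompson lists – patch currency, administrator rights, monitoring – lend themselves to a simple automated audit. The sketch below is hypothetical (the host record fields and thresholds are invented for illustration; a real audit would query patch-management and identity systems), but it shows the shape of a passive-defence check across a fleet of networked devices, rifle scopes included.

```python
# Sketch of a "basic network hygiene" audit over host records.
# Field names and thresholds are hypothetical, chosen to mirror Thompson's checklist.

def hygiene_findings(host):
    """Return a list of hygiene findings for one host record (a dict)."""
    findings = []
    if host.get("pending_patches", 0) > 0:
        findings.append("patches out of date")
    if len(host.get("admins", [])) > 2:          # arbitrary illustrative threshold
        findings.append("too many administrator accounts")
    if not host.get("monitored", False):
        findings.append("no network monitoring")
    return findings

# Example fleet: one neglected host, one well-kept one
fleet = [
    {"name": "scope-07", "pending_patches": 3,
     "admins": ["a", "b", "c"], "monitored": False},
    {"name": "gateway-1", "pending_patches": 0,
     "admins": ["ops"], "monitored": True},
]

report = {h["name"]: hygiene_findings(h) for h in fleet}
```

Checks like these are the measures Thompson suggests might stop the bulk of attacks before active defence is ever needed.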

Number three is active defence. This is about being active within networks in such a way that you quickly identify and respond to things that shouldn’t be there.

Major General Marcus Thompson, who heads up the Australian Defence Force’s Information Warfare Division.

Trends in cyber warfare

 

Major trends in the cyber world right now are around the understanding of the arena, as opposed to the technology in the arena itself.

“What I am seeing and sensing in terms of trend is a greater acceptance of the risk of attacks in the information domain. It’s now accepted, and that’s not just a military view,” Thompson said.

“We’re also seeing throughout Australian society, and certainly into the commercial domain, a greater understanding of the information environment and especially cyberspace. The third observation I’d make in terms of something you might list as a trend is some confusion and conflation of our terminology and our collective approaches, particularly to cyber defence and cybersecurity.”

That is partly why Thompson talks about cyber defence in the three areas of ‘self-defence’, ‘passive defence’ and ‘active defence’, he explained. It’s simply to bring some structure to the conversation.

“There are some folk who might have a virus checker on their personal computer, and some folk who want to go straight to more advanced techniques that are probably the exclusive domain of government,” Thompson said.

 

Change is guaranteed

 

Thompson is looking forward to the Australian Engineering Conference (AEC) to be held in Sydney in September, where he will be a keynote speaker on cyber and space security. In such a fast-moving industry, he said a new development the day before his speech could change it completely.

“At the AEC 2018 I’ll certainly speak as an engineer about the role of engineers in this space,” he said.

“But also, and perhaps more importantly given my role, I’m keen to let people know what the Australian Defence Force is doing in terms of the development of our information warfare capabilities. Then I’ll be discussing the opportunities that might exist for industry and academia to partner and work with us.”

Those capabilities are being developed to engineer an advantage. If young Australians are being put in harm’s way, Thompson said, the ADF wants to give them the very best tools to survive and succeed.

“Part of that is understanding what the future operating environment might be,” he said.

“Part of it is understanding and analysing potential threats and designing capabilities to generate a relative advantage.” 

Trust in engineering is more important than ever – here’s why

The future probably won’t be a dystopian nightmare, but it feels like the world is dangerously close to the edge. Engineers will be crucial to keeping us from tipping over, says Professor Elanor Huntington, Dean of the College of Engineering and Computer Science at ANU.

 

Civil engineers will soon no longer need to spend time calculating stresses and strains on the girders that hold up railway tunnels because they’ll have powerful computers to do that. Instead they’ll use their knowledge to figure out complex transportation engineering challenges, such as how they might reliably transport a large number of people across a city in a short period of time.

But play this tape forward 10 or 15 years, says Professor Elanor Huntington, Dean of the College of Engineering and Computer Science at ANU, and the picture will be very different.

“You’re going to have collections of civil engineers whose task is to figure out how to get three million people home safely from one side of a city to another in half an hour,” she said.

“At the same time, teams of electrical engineers will be figuring out how to achieve dynamic optimisation of delivery of electricity, not just to a particular railway station, but to the railway network, possibly the whole city, and maybe even the entire continent.

“Also, at the same time, the digital healthcare people will be trying to figure out how to achieve dynamic optimisation of in-home healthcare based on Twitter usage, and numerous other engineering disciplines will also be facing their own related issues. And that’s what we need to realise about the future; all of the challenges will be somehow related.”

Huntington, who is taking part in a panel discussion at the Australian Engineering Conference (AEC) 2018 on the topic of ‘Engineers as part of the great leap forward’, is most interested in how all of this ties together.

“Given that the challenges are all highly interconnected, we haven’t even begun that conversation yet,” she said.

It’s not a dystopian, fictional future that she’s talking about, Huntington insists. It is a very real future, the cusp of which we’re already standing on and for which the ANU College of Engineering and Computer Science is reshaping its offerings.

The college is beginning to educate people towards the way engineering will look in 2050, she said. This will be an industry in which various engineering specialisations will have to come together within a far bigger system.

“It is going to change the way all types of engineers work,” Huntington said.

“Think about the example of the civil engineers working on the railway. The changes touch on the way we currently think of civil engineering, they touch on what we currently think of as electrical engineering, they touch on what we currently think of as software engineering and so much more.

“It’s a bigger system because these and other engineering disciplines will all interact with each other. We’re going to need people who can go deep into any one of these disciplines but who can also pull all the way back up and understand how they all connect.”

These all-seeing, all-knowing engineers are known as ‘systems engineers’, she says, and of course they have been in existence for a long time. But the new breed of systems engineers will find themselves working in an entirely different realm to the one they know today. They will be tasked with creating a workable system out of numerous components including civil, electrical, software, social media, human and more.

“A decade from now things are going to look really quite different because we’re going to be actively prosecuting this train of thought around the convergence of previously disparate disciplines,” she said.

“That is our focus, the convergence of disciplines. Basically, at ANU we’re going to build a college that is custom designed for the middle of the 21st century.”

Disruptive technologies

 

Huntington has watched this change coming during a career spent mostly in the academic field. She also spent some time working with what was then known as the Defence Science and Technology Organisation and has been called in to organisations and onto projects as a professional consultant, particularly in the area of laser safety. Her research focus is very much on the future, specifically around experimental quantum optics.

“That sounds very fancy, but basically, I do research that looks into the technology of quantum communications and computing,” she said.

It’s a field that will change the way engineers work as it will offer faster and more powerful computing and communication systems. Quantum communication technologies, for so long an exciting idea that was near impossible to put into practice, have been in use for specific applications for over a decade.

In 2007, a Swiss election was held with the assistance of quantum cryptography, and in 2016, China launched a satellite that utilises ‘quantum key distribution’, which offers unconditional communication security between the satellite and teams on the ground.
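The idea behind quantum key distribution can be illustrated with a toy, classical simulation of the BB84 protocol: the sender transmits random bits in randomly chosen bases, the receiver measures in his own random bases, and the two then publicly compare bases and keep only the bits where they matched. This sketch assumes no eavesdropper and uses ordinary random numbers; the actual security guarantee comes from quantum physics, not from code like this.

```python
import random

# Toy BB84 sketch (classical simulation; real QKD requires quantum hardware).
def bb84(n, rng):
    alice_bits  = [rng.randint(0, 1) for _ in range(n)]
    alice_bases = [rng.choice("+x") for _ in range(n)]  # '+' and 'x' polarisation bases
    bob_bases   = [rng.choice("+x") for _ in range(n)]

    # With no eavesdropper, a matching basis recovers Alice's bit exactly;
    # a mismatched basis gives a random result (discarded later anyway).
    bob_bits = [b if ab == bb else rng.randint(0, 1)
                for b, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

    # Publicly compare bases and keep only the positions where they matched.
    key_alice = [b for b, ab, bb in zip(alice_bits, alice_bases, bob_bases) if ab == bb]
    key_bob   = [b for b, ab, bb in zip(bob_bits,  alice_bases, bob_bases) if ab == bb]
    return key_alice, key_bob

key_a, key_b = bb84(64, random.Random(0))  # both parties end up with the same key
```

In the real protocol, an eavesdropper measuring in the wrong basis disturbs the transmitted states, so comparing a sample of the key reveals the intrusion – the property the 2016 satellite exploits.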

“I work in the grey area between physics and engineering, in demonstration-of-principle experiments, which require large teams of researchers,” Huntington said.

“My research is all about trying to develop technology that will allow those ideas to escape into the wild and be commercialised.”

Elanor Huntington.

Are engineers slow to innovate?

 

There has long been a belief that engineers are slow to innovate and that, as a result, the future will come as a rude shock. Huntington says this point is a complex one. The motivation to stick to the tried and true comes not from a lack of innovation but instead from the simple fact that engineering, when it’s boiled down, is all about trust.

“Engineers are driven by a couple of competing sets of motivations,” she said.

“One is that we need to solve complicated design problems and to do this we are motivated by coming up with a creative, good solution. But that ‘good solution’ includes delivering on one of the fundamental promises of engineering: trust.

“When you drive across the Sydney Harbour Bridge, you trust that the bridge is not going to fall down. If you look at what’s been going on, particularly in the tech world recently, one of the things that is causing a lot of pain is the erosion of trust.”

This means engineers are constantly dealing with competing demands. The urge to create an innovative, high-quality solution must always be balanced out by the fact that the solution must do no harm. That leads to a tension between creativity and conservatism, and this is unlikely to ever change.

What will change is the way that various engineering disciplines will converge, meaning the safety, quality and security aspects of each will collectively be required to meet certain standards. Engineered software will be held up to the same safety and quality standards as physical constructions, for instance.

“We actually need to change the conversation a little bit around trust because we now live in a world of highly-connected heterogeneous systems,” Huntington said.

“They’re all distributed, they’re all different, but they’re also all deeply interconnected. The old ways of guaranteeing safety and trust don’t work so well. We need to completely rethink what trust looks like in this new world.”

Professor Elanor Huntington will be taking part in a panel discussion at this year’s Australian Engineering Conference in Sydney.