
WEC 2019 Day 2: Is the future of engineering human?


Is the future of engineering human? It’s a big question, but a panel of experts had a go at finding an answer during the opening session of day two of the World Engineers Convention, happening now in Melbourne.

Moderator Jon Williams, Partner and Co-Founder of management consulting firm Fifth Frame, began the discussion with a question for the audience: How optimistic are you about the future of engineering?

As the responses poured in (on a scale of 1-5, 5 being very optimistic, 1 being ‘we’re doomed’), it became clear that while there’s some uncertainty, many feel there will still be a place for engineers in the future. 

The panelists’ answer was that the future of engineering is absolutely human, but all agreed this will only be the case if the profession can shift in some fundamental ways.

John Sukkar.

Advances in technology are enabling new ways of working and thinking, said John Sukkar, Director – Engineering and Design for Data61. But rather than fear how technology might impact the role of engineers, he said the profession should see this as an opportunity.

“As we go through digital transformation, it’s not an elimination of jobs but rather a shift in skill sets,” he said.

Take manufacturing as an example: 50 years ago, it was a very labour-intensive industry, but today many processes are automated. As we move into Industry 4.0, the same shift is likely to happen for more professional services roles, like engineering, law and medicine.

Rather than threatening engineers or taking their jobs, technology will amplify their abilities and allow engineers to achieve more with less, said Felicity Furey, Co-founder of Power of Engineering and Director of Industry Partnerships at Swinburne University.

However, she emphasised that as technology makes inroads into industry, the skills required to be a ‘good’ engineer will change. 

“We will need engineers with empathy, ethics, good communication skills, collaboration, creativity and a healthy dose of scepticism — you can’t get that from tech,” she said. 

“We need the left brain and the right brain, the art and science, to come together.”

Meredith Westafer, Senior Industrial Engineer at Tesla, agreed, and added that regardless of what the future brings, the core purpose of engineering will remain the same: solving problems for people. 

“What will change is how we do that,” she added.

As tech frees engineers from the more repetitive or mundane tasks, Westafer said they will be able to concentrate on interesting and creative work — “work with a purpose”. She added that this makes it imperative for organisations to start thinking about the message they broadcast to the world about what it’s like to work there.

“Being able to attract the best talent is doing something people want to do – it’s the message you send about why your organisation exists,” she said. 

She spoke of an experience that, based on the murmurs of agreement from the audience, is a familiar one for engineers: in school, engineers are encouraged to “think big”, but once they enter the workforce, they often become hobbled by processes and the ‘this is how it’s always been done here’ mentality. 

“Let creative engineers create if you want to retain them,” she implored. 

Automating ethically

As technology takes over more of these tasks, though, engineers face an ethical dilemma: if there is a gradual reduction in people’s involvement in more manual or repetitive tasks, do engineers have a responsibility to keep people in jobs — even if they don’t need to be there?

Meredith Westafer.

To answer this question, Westafer drew on her own experience designing Tesla’s Gigafactory, which incorporates autonomous technology.

“As someone who has installed a fair amount of automation, it’s important to understand things from the side of the people displaced by the technology,” she said. 

Crucial to this is thinking about the types of jobs replaced; many of them are mundane or dangerous jobs, jobs “we don’t want humans to be doing”.

“We have an ethical imperative to keep people safe,” she said. 

“If [technology] is replacing a good job, organisations are ethically bound to retrain people. I don’t think it’s immediately obvious that organisations should be taking care of that, but if you put the onus on the person whose job is being replaced, that’s just not right.

“We need to automate ethically.” 

What are the right skills?

Upskilling the current workforce is one thing. But what about for the next generation of engineers? If the future is uncertain, how can we prepare people today to deal with the challenges of tomorrow?

In her role at Swinburne, Furey said they took this question to industry and asked engineering companies what skills they need in their organisations. The answers surprised her.

“We thought for sure it would be technical skills, but actually they came back with skills like communication, collaboration, being able to influence people, even knowing how to write a good email,” she said. 

She believes that in order to teach these skills, students need to be working on real-world projects and solving real problems as part of their degree. 

Sukkar said cultivating the skills future engineers will need also means “encouraging and rewarding people for taking risks and thinking big”.

Furey agreed, and said ‘why’ will become the most valuable question an engineer can ask. Organisations need to encourage this behaviour, she added.

“Create psychological safety in your organisations to encourage people to take risks. Give people the freedom to fail,” she said. 

Felicity Furey.

As the role of engineers changes, all the panelists agreed that they are looking forward to seeing more engineers in leadership positions. According to Furey, 21 per cent of S&P CEOs come from engineering backgrounds, which is more than lawyers or accountants.

She said the skills required to be a great leader are changing, and engineers have an amazing opportunity to step up to the challenge.

“The top skill required to be a leader today is to empower people … it’s no longer about command and control, it’s about support and empower,” she said. 

It also swings back around to the ‘why’, Westafer said, and great engineering leaders need to actively encourage that in their organisations. For example, she said Tesla CEO Elon Musk pushes first-principles thinking across all of the company’s work.

“It’s not about building an electric car the way everyone else has built an electric car. We need to be asking ‘what is the real question we are trying to answer? What is the problem we are trying to solve?’ and then work from there,” she said. 

To test for this mentality in prospective Tesla engineers, one question Westafer always asks during interviews is: If I have a manufacturing line that is 1000 m long, how big is the factory?

“If someone responds with ‘you haven’t given me enough information’, they aren’t hired,” she said.

“I’m looking for people who ask as many questions as they want. What are we optimising for? How many parts are there? What are we building? That’s the kind of thinking we are looking for.” 

At the end of the session, Williams polled the audience with the same question he asked at the start: How optimistic are you about the future of engineering? 

Perhaps luckily for all, and as a testament to the quality of the panelists’ insights, the audience came out of the discussion more optimistic about people’s place in the future of engineering than when they arrived.


How can connected transport help urban networks work in perfect harmony?



A blue, blocky minibus shuttles its way around suburban streets collecting waiting passengers, humming to a stop as it lets people on and off, ferrying them safely and efficiently between homes, shops and transport hubs.

There’s no polite nod to the bus driver as passengers alight from their ride – because there is no driver on this bus. The bus is automated. It knows where to go, and it senses when it needs to stop to let a person safely past. It ‘speaks’ to other vehicles it meets along its path so they both know which way to go. This is the future for automated vehicles like those being trialled in the University of Melbourne’s Australian Integrated Multimodal EcoSystem (AIMES).

Transport nirvana

Automation is just one example of how technology is influencing the design of future transport to challenge our current understanding of urban landscapes. It points to a future in which the perils of human distraction, and their potential consequences, become a thing of the past. An effective transport system plays a vital role in making a city liveable, and is a key driver of competitiveness in the global marketplace.

In this sense, AIMES is at the top of its game as a world-first living laboratory based on the streets of Melbourne, established in 2016 to test highly integrated transport technology in a real-world environment. AIMES has grand plans to deliver safer, more efficient and more sustainable urban transport outcomes.

Together with a team of transport experts, Professor Majid Sarvi, Director of AIMES, is developing overarching infrastructure to allow all road users (drivers, cyclists and pedestrians) to connect with each other and sense their greater environment for distributed cooperative cognition.

 

This shared thinking approach allows road users to detect congestion hot spots faster and keep traffic flowing better. It will also make our roads safer.

“It has been estimated that connected transport can reduce the economic cost of road crashes by more than 90 per cent. And best of all, such a system can learn, improve and evolve. We call this new technological capability ‘intelligent connectivity’,” Sarvi said.
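
To make ‘intelligent connectivity’ a little more concrete, the sketch below shows, in a few lines of Python, the flavour of the idea: road users broadcast simple state reports, and congestion hot spots emerge from the pooled data. It is purely illustrative; the message fields, street names and threshold are invented for the example and are not part of the AIMES platform or any V2X standard.

```python
# Illustrative sketch only: not the actual AIMES platform or any V2X standard.
# Road users share simple state reports, and congestion hot spots are spotted
# from the pooled data (a toy version of "intelligent connectivity").
from collections import defaultdict
from dataclasses import dataclass


@dataclass
class StateReport:
    user_id: str       # vehicle, cyclist or pedestrian (hypothetical identifier)
    road_segment: str  # e.g. an intersection or block within the sensor grid
    speed_kmh: float   # current speed reported by the road user


class SharedTrafficView:
    """Pools reports from all connected road users."""

    def __init__(self):
        self.reports = defaultdict(list)

    def broadcast(self, report: StateReport):
        self.reports[report.road_segment].append(report)

    def congestion_hotspots(self, speed_threshold_kmh: float = 15.0):
        # A segment counts as congested if the average reported speed is low.
        hotspots = []
        for segment, reports in self.reports.items():
            avg_speed = sum(r.speed_kmh for r in reports) / len(reports)
            if avg_speed < speed_threshold_kmh:
                hotspots.append((segment, avg_speed))
        return hotspots


view = SharedTrafficView()
view.broadcast(StateReport("bus_12", "Swanston/Grattan", 8.0))
view.broadcast(StateReport("cyclist_3", "Swanston/Grattan", 12.0))
view.broadcast(StateReport("car_7", "Lygon/Elgin", 42.0))
print(view.congestion_hotspots())  # [('Swanston/Grattan', 10.0)]
```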

Success factors

A key driver of AIMES’ success lies in its collaborative approach. AIMES is an evolving partnership of more than 50 domestic and international transport leaders from industry, research and government. AIMES partners share a passion to work together to solve today’s city mobility challenges.

AIMES is a network of smart sensors connecting all parts of the transport environment within a six square kilometre grid on the streets of inner-city Carlton, Melbourne. It provides a unique platform in a real-world environment for collaborative technology trials that integrate the movement of all road users (people and vehicles) with transport infrastructure.

The vision from the team behind AIMES is as simple as it is complex: connected vehicles, connected public transport, connected pedestrians and cyclists, and smart public transport stations.

Hopefully, that same blue, blocky minibus will soon greet you at the train station to offer you a safe, efficient and smart ride home.

See this world-first living laboratory in action as part of an offsite tour at the World Engineers Convention, held 20-22 November in Melbourne. To learn more and to register, click here.


With autonomous robots on the rise, what do engineers need to know?


As collaborative robots give way to autonomous ones, the future is not as frightening as you might think, says Professor Elizabeth Croft, presenter at the World Engineers Convention.

When her daughter came home with a textbook that said robots are designed by ‘scientists’, Professor Elizabeth Croft was very surprised. Most of the driving force behind robot technology and capability is coming from engineers, she said.

“I had a bit of a fit when I saw what the textbook said. I told my daughter, ‘No, actually, engineering is pushing the forefronts of robotics. Science, art and design all contribute and help us to think about it, but the engineering part is what allows us to continue to innovate’,” said Croft, Dean of the Faculty of Engineering at Monash University.

When Croft talks about the future of robotics, she’s not discussing the hand-guided ‘collaborative’ machines that, for instance, help people on an assembly line lift engine blocks into car bodies and switch off when their operator is absent. She means fully autonomous robots.

“Collaborative robots, or ‘cobots’, were passive in the sense that they would not act unless the operator put motive force into them,” she said. They were very safe because they were not autonomous. If the operator did not touch the cobot’s controls, it would stop.

“Where we’ve moved is to a place where now we have autonomous robots that are independent agents, such as delivery robots, robots operating as assistants, etc.,” she said.

“This is the area that I focus on: robots that bring you something. Maybe they hand you a tool. Maybe they carry out parts of an operation that are common in a workplace. We’re interested in collaborating with those agents.”

These autonomous robots are different from cobots, Croft said, because they have their own agenda and their own intent. They are not tele-operated, and they are not switched on and off by an operator. They have their own jobs, just like people in the workplace, and they need no permission to operate.
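
The contrast can be sketched in code. The toy Python below is illustrative only, with hypothetical names rather than any vendor’s API: the cobot-style step does nothing without operator input, while the autonomous agent works through its own task list unprompted.

```python
# Illustrative only; class and function names are hypothetical, not a vendor API.

def cobot_step(operator_input):
    """A cobot is passive: it acts only while the operator supplies input."""
    if operator_input is None:
        return "stopped"  # no operator input, so the cobot does nothing
    return f"assisting with {operator_input}"


class AutonomousRobot:
    """An autonomous robot carries its own task list and acts unprompted."""

    def __init__(self, tasks):
        self.tasks = list(tasks)  # the robot's own 'agenda'

    def step(self):
        if not self.tasks:
            return "idle"
        return f"doing {self.tasks.pop(0)}"  # acts on its own intent


print(cobot_step(None))                    # stopped
print(cobot_step("lifting engine block"))  # assisting with lifting engine block

robot = AutonomousRobot(["fetch tool", "deliver part"])
print(robot.step())  # doing fetch tool
print(robot.step())  # doing deliver part
```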

It’s in this area that Croft works, in the space where rules of engagement have to be figured out. Several major issues are slowing things down right now, such as questions around liability and safety frameworks. Also, how does the front-end work, or how do humans interact with the robot? How do they tell it what they want it to do? If voice operation is key, then we’re clearly not there yet, judging by the voice interactions with our smartphones.

And what about social and ethical impacts of technology in society? These are powerful, autonomous systems that are being developed, so how and where should boundaries be drawn to ensure Skynet doesn’t send a cyborg assassin to kill Sarah Connor?

“The underlying programming and bounding of how much autonomy those systems have really impacts what consequences can happen,” Croft said.

“So, it is very important that students of this technology think about ethical frameworks in the context of programming frameworks. Ethics must underlie the basic design and concepts around how an autonomous system operates. That needs to be part of the fundamental coding, part of the training of an engineer.”
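
What ‘bounding autonomy in the fundamental coding’ might look like can be hinted at with a toy example. The Python sketch below, with invented zones and limits rather than any real ethical or safety framework, puts hard bounds in the core control path so that actions outside the permitted envelope are refused before they execute.

```python
# Illustrative only: the zones, speed limit and rules are invented for the
# example; this is not a real ethical framework or safety standard.

ALLOWED_ZONES = {"workshop", "corridor"}  # assumed operating envelope
MAX_SPEED_MS = 1.5                        # assumed speed limit near people


def within_bounds(action):
    """Hard limits checked before any autonomous action is executed."""
    return action["zone"] in ALLOWED_ZONES and action["speed_ms"] <= MAX_SPEED_MS


def execute(action):
    # The bound sits in the core control path, not as an afterthought:
    # actions outside the envelope are refused regardless of the planner's intent.
    if not within_bounds(action):
        return f"refused: {action['name']} falls outside the permitted envelope"
    return f"executing: {action['name']}"


print(execute({"name": "deliver tool", "zone": "workshop", "speed_ms": 1.0}))
print(execute({"name": "sprint to dock", "zone": "loading bay", "speed_ms": 3.0}))
```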

Reducing complication

In order to tone down the Terminator imagery, Croft offers an example of how an autonomous robot might change workflow for the better.

When you buy a piece of furniture from IKEA, the instructions contain a small picture of a man and look friendly, but they’re actually quite complicated. There are numerous pieces, many only slightly different from one another. Some are very small, some are very large, some are flexible. Assembly requires dexterity and making choices about what must be done in what order. Constant close inspection is a must because of the numerous dependencies.

Professor Elizabeth Croft.

“This job cannot be fully automated because it’s too expensive,” she said.

“But there are parts of that operation where it would make a lot of sense to have more automation or assistance involved.”

Such technology is very close to reality right now, but we don’t have the legal and other frameworks to make it fully operational.

“We’ve come to a place where people can grab onto a robot, move it around, show it an operation, then press a button and the robot does it,” Croft said.
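
That ‘grab it, guide it, press a button’ workflow is essentially kinesthetic teaching: record the poses an operator demonstrates, then replay them. The minimal Python sketch below illustrates the record-and-replay idea with hypothetical names; it is not a specific robot’s API.

```python
# Illustrative only: the recorder and poses are hypothetical, not a robot's API.

class DemonstrationRecorder:
    def __init__(self):
        self.waypoints = []

    def record(self, pose):
        """Store the pose the operator hand-guided the arm to."""
        self.waypoints.append(pose)

    def replay(self):
        """Autonomously revisit the demonstrated poses in order."""
        for pose in self.waypoints:
            print(f"moving to {pose}")  # a real robot would command its joints here


teach = DemonstrationRecorder()
teach.record((0.4, 0.0, 0.3))  # operator guides the arm to a pick position
teach.record((0.4, 0.2, 0.5))  # ...then to a place position
teach.replay()                 # press the button: the robot repeats the operation
```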

“But because of legal issues, liability and occupational health and safety, there are risks that need to be managed. There are issues around getting the person and the robot to come together in a workspace in a safe way. Who’s responsible? When the operator is always in charge, then there’s no doubt. But when the operator has no longer got their hand on the big red button, then there is risk.”

Who assumes that risk? In Europe, Croft said, the risk is assumed mainly by the manufacturer of the robot, which creates a challenge for innovation. In North America, the risk is often assumed by the person or company that owns the robot. In other jurisdictions, the risk could be assumed by the worker who is using the robot.

Swapping robots with humans

Outside the legal framework, the biggest issue is the workflow itself. On a typical production line, for instance, if one worker can’t do a job, another is brought in to take their place. People are quickly interchangeable. The same needs to be true of a robot being replaced by a human: if the robot breaks down, the business can’t stop operating, so humans and robots must be easy to swap in and out.

There also needs to be a clear understanding of the value the robot offers, so the worker is comfortable working alongside it. And the worker must feel that the robot understands what they do, too.

“It will become a greater and greater requirement for educators of people working in software engineering or computer engineering to create a real understanding of the impacts – ethically, socially, environmentally – of the designs they create,” Croft said.

“We’ll need professionals interested in public policy and engineers with a strong ethical framework. The engineers are creating the future of technology. We are the ones who first see the potential impacts. If we don’t prepare our people for that, we’ll see unintended consequences of the technology.”

Elizabeth Croft will be speaking about how engineers can set the agenda for future technology implementation at the upcoming World Engineers Convention. To learn more and to register, click here.