The Institution of Engineering and Technology iet.tv

AI and human digitisation: when seeing is not believing?

CPD This content can contribute towards your Continuing Professional Development (CPD) as part of the IET's CPD Monitoring scheme.
  • Duration: 1 hr 26 mins
  • Publication date: 17 Sep 2018
  • Part of series IET Prestige Lecture Series, EngTalks - FKA The Kelvin Lecture Series

Abstract

Human digitisation – The how, the benefits and the pitfalls

Is the next generation inspired to push the boundaries in the future?

Are they given a chance to show their creativity and bring engineering and the arts together?

From the latest smartphones to advances in supercomputing, the visual effects technology behind today’s digital age is rapidly changing.

Ubiquitous information and communication are transforming our lives and revolutionising not only how we work, but how we see and understand the universe.

AI, voice recognition, deep learning and deepfakes – Dr Li will cover it all and more…

Taking human digitisation to the next level

When the first photorealistic computer-animated feature film, 'Final Fantasy: The Spirits Within', was released in 2001, it was heralded as the movie that could prove to be the death knell of the actor.

The lead character, Aki Ross, was designed to be as realistic as possible; the now defunct studio, Square Pictures, intended for the CGI character to be the world's first artificial actress to appear in multiple films in multiple roles.

This great promise was not realised and audiences were unconvinced. Despite significant advances over the 17 years which have followed, there is still a feeling that true human digitisation is some way off: surely we will always know whether an image of a person is genuine? How can better quality be achieved? What are the potential pitfalls of this technology, and what might the benefits and applications be?

In his EngTalk, Dr Hao Li will present his work on photorealistic human digitisation and rendering using deep learning: an original method for animating a digital avatar in real time based on the facial expressions of a head-mounted display user.

His approach achieves higher-fidelity animations than existing methods and requires no user-specific calibration. It regresses images of the user directly to the animation controls of a digital avatar, and thus avoids the explicit 3D tracking of the subject's face performed by many existing methods for realistic facial performance capture.

His system demonstrates that plausible real-time speech animation is possible through the use of a deep neural net regressor, trained with animation parameters that not only capture the appropriate emotional expressions of the training subjects, but that also make use of an appropriate psychoacoustic data set.
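The direct-regression idea described above can be sketched in a few lines. This is a toy illustration only: the layer, the control count and the image size are assumptions for the sketch, and a stand-in random linear map replaces the trained deep neural net regressor of the actual system.

```python
import numpy as np

N_CONTROLS = 51          # assumed number of avatar blendshape controls
IMG_SHAPE = (64, 64)     # assumed size of the headset camera crop

# Stand-in for a trained regressor: one random linear layer.
rng = np.random.default_rng(0)
W = rng.standard_normal((N_CONTROLS, IMG_SHAPE[0] * IMG_SHAPE[1])) * 0.01
b = np.zeros(N_CONTROLS)

def regress_controls(image: np.ndarray) -> np.ndarray:
    """Map a grayscale face image straight to animation controls.

    No explicit 3D face-tracking step: the regressor consumes raw
    pixels and emits blendshape weights squashed into [0, 1].
    """
    x = image.astype(np.float64).ravel() / 255.0   # normalise pixels
    raw = W @ x + b                                # direct regression
    return 1.0 / (1.0 + np.exp(-raw))              # sigmoid to [0, 1]

# One camera frame in, one vector of avatar controls out, every frame.
frame = rng.integers(0, 256, size=IMG_SHAPE)
weights = regress_controls(frame)
```

The design point the sketch captures is that per-frame inference is a single forward pass, which is what makes real-time animation feasible without a separate tracking pipeline.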

Hao Li will also address the ethical dilemmas linked to his research. Artificial intelligence video tools make it relatively easy to put one person’s face on another person’s body with few traces of manipulation. So-called deepfakes are one of the newest forms of digital media manipulation, and one of the most obviously mischief-prone.

It’s not hard to imagine this technology being used to smear politicians, create counterfeit revenge porn or frame people for crimes. “I see this as the next form of communication,” he said in an interview with The New York Times. “I worry that people will use it to blackmail others, or do bad things. You have to educate people that this is possible.”

Keywords:
  • 3D
  • AI
  • EngTalk
  • EngTalks
  • algorithm
  • animation
  • artificial intelligence
  • deep learning
  • deepfakes
  • digital avatar
  • digital media manipulation
  • digitisation
  • human digitisation
  • machine learning
  • smartphones
  • supercomputing

Channels

  • Prestige Lectures
  • IT

Speakers

  • Dr Hao Li

    Dr Hao Li

    University of Southern California, Institute for Creative Technologies, Director of Vision and Graphics Lab

    Dr Hao Li is best known for his work on dynamic geometry processing and data-driven techniques for making 3D human digitisation and facial animation accessible to the masses. He worked on the famed digital re-enactment of Paul Walker in the movie Fast and Furious 7, and his work on depth sensor-driven facial animation led to the Animoji feature on Apple’s iPhone X.

    He is the Director of the Vision and Graphics Lab at the Institute for Creative Technologies (ICT) of the University of Southern California (USC) and a tenure-track assistant professor in the Computer Science Department of USC’s Viterbi School of Engineering in Los Angeles. He is also CEO of Pinscreen, a well-funded startup bringing AR to mobile communication. His research interests include human digitisation, performance capture, facial animation, hair capture, 3D scanning, deep learning, data-driven techniques, geometry processing, virtual reality and augmented reality. Hao Li describes himself as a German-born punk of Taiwanese descent doing computer graphics.

    His algorithms for dynamic shape reconstruction and non-rigid registration are widely deployed in industry, ranging from leading visual effects studios to defence projects and manufacturers of state-of-the-art radiation therapy systems.

    He received the Office of Naval Research (ONR) Young Investigator Award in 2018, a Google Faculty Research Award, the Okawa Foundation Research Grant, and the Andrew and Erna Viterbi Early Career Chair in 2015. He was named one of the world’s top 35 innovators under 35 by MIT Technology Review in 2013 and one of the NextGen 10: Innovators under 40 by C-Suite Quarterly in 2014, and in 2016 he was ranked #1 on Microsoft Academic’s Top 10 Leaderboard of Computer Graphics research for the preceding five years. Hao frequently appears in the media, including CBS News, ABC News, Das Erste ARD, ABC Australia, The New York Times and the LA Times.
  • Dr Sarah Atkinson

    Dr Sarah Atkinson

    King’s College London

    Dr Sarah Atkinson is Head of the Department of Culture, Media & Creative Industries at King's College London and co-editor of Convergence: The International Journal of Research into New Media Technologies. Sarah has published widely on the impacts of digital technologies on film and cinema audiences and on film production practices.

    Sarah has undertaken extensive work on the Live Cinema economy and is currently working on a number of immersive media projects, including a virtual reality diversity initiative, a project exploring artificial intelligence and conversational interactivity in games, and ‘XR Circus’, which brings together circus artists with immersive technologists.

    Throughout history, the introduction of every new medium, from photography to the world wide web, has brought a period of uncertainty and confusion as audiences attempt to comprehend the blurred line between reality and fiction. Sarah’s talk provides the context for the recent emergence of ‘deepfakes’, a phenomenon that follows this very same media continuum. As media history has taught us, within this transitional moment, and through continued exposure to the new medium, audiences over time become increasingly literate and able to discern the boundary between fact and fiction. However, as Sarah’s talk illuminates, the rapidity of technological innovation behind this latest phenomenon has made it ever more challenging to educate audiences, and the distinction between what is real and what is fake can seem inconceivable.
  • Yewande Akinola

    Yewande Akinola

    Yewande (pronounced Yeah-wan-day) is a chartered engineer, an innovator, a dreamer and a speaker with tons of passion for the role of innovation, creativity and engineering in our world today. She holds a Bachelor’s degree in Engineering Design and Appropriate Technology from the University of Warwick and a Master’s in Innovation and Design for Sustainability from Cranfield University. Yewande’s engineering experience and responsibilities include the design and management of sustainable water supply systems in the built environment. She has worked on projects in the UK, Africa, the Middle East and East Asia.

    Yewande has been awarded the UK Society of Public Health Engineers’ Young Engineer of the Year. She won the 2012/2013 UK Young Woman Engineer of the Year (Institution of Engineering and Technology) and has been honoured with the Exceptional Achiever Award by the Association for Black Engineers (AFBE-UK) and the Association of Consultancy and Engineering, UK (ACE). She was named one of the UK’s top 35 women under the age of 35 by Management Today in 2013. In 2014, she won the PRECIOUS AWARDS Outstanding Woman in STEM award. She has presented television programmes for Discovery Channel, Channel 4, Yesterday TV and CBeebies.

    She is enthusiastic about sharing her life and engineering experiences with young audiences and dedicates a proportion of her time to speaking in schools. She participates in global forums on world challenges and global education, contributing to panel discussions on the next steps for developing and implementing solutions to our world’s grand challenges. In her spare time, she enjoys building models, settling into a good book, dancing, writing, spending time with family and friends and exploring new places.

Address: Futures Place, Kings Way, Stevenage, SG1 2UA

Telephone: +44 (0)33 049 9123

Email:  iet.tv@theiet.org

© 2026 The Institution of Engineering and Technology.

The Institution of Engineering and Technology is registered as a Charity in England & Wales (no 211014) and Scotland (no SC038698). Futures Place, Kings Way, Stevenage, Hertfordshire, SG1 2UA, United Kingdom


Embed Code

<script type="text/javascript" src="https://play.cadmore.media/js/EMBED.js"></script> <div class="cmpl_iframe_div"> <iframe src="https://play.cadmore.media/Player/eee2aef2-be54-4736-aecd-17cf4231c68c" scrolling="no" allowtransparency="true" allowautoplay="true" frameborder="0" allow="encrypted-media;autoplay;fullscreen" class="cmpl_iframe" allowfullscreen="" style="overflow: hidden;border: 0px; margin: 0px; height: 100%; width:100%;"></iframe> </div>
