
22 November 2022: Talk by Sven Mayer, LMU Munich

The Next Generation of Computing Systems: Artificial Intelligence meets Humans

Abstract: Traditionally, humans used machines to support them while staying in full control. However, over the last decades, we have moved away from this paradigm; today, systems support users beyond their capabilities. For instance, smartphone text entry uses prediction algorithms to improve human typing performance. Here, the human-in-the-loop approach guarantees that the final result aligns with human expectations. Yet, with more autonomous systems entering our daily lives, such as self-driving cars, the question becomes how and to what extent users should stay in control. This is crucial as handing over control creates a tradeoff between easing the users’ burden and aligning with their expectations. Thus, the next generation of computing systems needs to provide intervention methods as they relieve the user through artificial intelligence and automation. In this talk, I will highlight how my research sits at the intersection between Human-Computer Interaction and Artificial Intelligence, allowing humans and machines to learn and accomplish tasks together.

Bio: Sven Mayer is an assistant professor of computer science at LMU Munich (Germany). His research sits at the intersection between Human-Computer Interaction and Artificial Intelligence, where he focuses on the next generation of computing systems. He uses artificial intelligence to design, build, and evaluate future human-centered interfaces. In particular, he envisions enabling humans to exceed their individual performance by collaborating with machines. He focuses on areas such as augmented and virtual reality, mobile scenarios, and robotics.


26 May 2022: Talk by Young-Woo Park

On 26 May 2022, A/Prof Young-Woo Park of UNIST gave a talk for the SydCHI community. This talk was co-organised by ACM SydCHI, the Materialising Memories research program (led by Prof van den Hoven), and the UTS Visualisation Institute (as part of our regular talk series).

Exploring Self-reflective Experiences through Tangible Interaction with Personal Lifelog Data

Abstract

People create and consume digital possessions such as photographs, digital music, videos, posts, texts, and documents through smart devices in their daily lives. This behavior has increased the types and amount of data individuals accumulate over time, and storing everything “just in case” has made it difficult for people to organize and retrieve that data. Although cloud services and AI make it easy to revisit data around particular themes, the intangible and indiscriminate nature of digital possessions still limits how people can revisit their past data. Our research team has investigated new designs and interactions that use data-driven technologies to support meaningful everyday activities and enrich self-reflective experiences. To this end, we use everyday objects to display, and provide tangible interaction with, personal lifelog data, supporting new experiences in daily life. This seminar will introduce examples of how people perceive and interact with personal lifelog data (e.g., photos, music listening history, schedules, and reading activity) through everyday artifacts in the home.

Bio

I am an Associate Professor of Design at the Ulsan National Institute of Science and Technology (UNIST) in South Korea, where I lead the Interactive Product Design Lab. As an interaction design researcher, I served as Program Chair of ACM TEI’22 and on the Papers Program Committees of ACM CHI’19 & ’20 and DIS’18 & ’20–’22, as well as in South Korea’s HCI and design communities. My research highlights the significance of ‘physical richness’ in interaction with technologies, using personal data as a material for designing interactive artifacts that enable tangible exploration of digital archives in everyday life.


14 April 2022: Talk by Mark Billinghurst

Please save the date and join us for our joint UNSW – SydCHI talk by Prof. Mark Billinghurst.

You can register your interest to attend via the Eventbrite page.
Please share this with your students, colleagues, or anyone interested in this topic!

Date

Thursday, 14 April 2022, 10:00 AM-11:00 AM

Where

Physical location: Room 113, K17, UNSW Sydney

Online: via MS Teams, with limited interaction (link will be shared later).

Empathic Computing: Creating Shared Understanding

Abstract

This talk provides an introduction to Empathic Computing, a new approach to creating shared understanding between people. Communication is a basic need for people, and over thousands of years a wide range of technologies have been used to enhance face-to-face and remote collaboration. However, the recent COVID pandemic has shown how limited these technologies are, with current remote collaborative experiences being far from face-to-face conversation. Empathic Computing is a new approach to collaboration that combines elements of Natural Collaboration, Experience Capture and Implicit Understanding. The talk gives an overview of the topic and then presents examples of research from the Empathic Computing Laboratory and others.

Bio

Mark Billinghurst is Director of the Empathic Computing Laboratory, and Professor at the University of South Australia in Adelaide, Australia, and also at the University of Auckland in Auckland, New Zealand. He earned a PhD in 2002 from the University of Washington and conducts research on how virtual and real worlds can be merged, publishing over 650 papers on Augmented Reality, Virtual Reality, remote collaboration, Empathic Computing, and related topics. In 2013 he was elected a Fellow of the Royal Society of New Zealand, and in 2019 he received the ISMAR Career Impact Award in recognition of his lifetime contribution to AR research and commercialization.


10 March 2022: Talk by Shengdong Zhao

Please save the date and join us for the joint USyd Basser Seminar – SydCHI talk by A/Prof Shengdong Zhao.

You can register your interest to attend via the Eventbrite page.
Please share this with your students, colleagues, or anyone interested in this topic!

Date

Thursday, 10 March 2022, 2:00-3:30pm (via Zoom, link no longer up)

Heads-up Computing: Towards the next interaction paradigm

Abstract

Interaction paradigms (the style of interaction between humans and computers) can significantly change the way we work and live. However, as much as we are empowered by interaction paradigms, we are also significantly constrained by them. Existing interaction paradigms limit our movements and activities, which can negatively affect our overall well-being. Desktop computing, described as “sitting at a desk, interpreting and manipulating symbols”, isolates human beings from interacting with other people and with nature. Mobile computing, although it frees us from the office environment, demands constant eye-and-hand engagement, leading to the notorious phenomenon of “smartphone zombies”.

We need a new style of interaction that better supports human activities in nature and with other people, while reducing cognitive load by blending reactive operations with appropriately designed proactive initiatives that offer just-in-time assistance.

Bio

Dr. Shengdong Zhao is an Associate Professor in the Department of Computer Science, National University of Singapore. He established the NUS-HCI research lab. Dr. Zhao completed his Ph.D. degree in Computer Science at the University of Toronto. He also holds a Master’s degree in Information Management & Systems from the University of California at Berkeley.


11 Feb 2022: Talk by Barrett Ens

On 11 February 2022, from 4-5pm, Dr Barrett Ens will give a talk for the SydCHI community.

Using Space Around Us to Support Data Exploration

Abstract

The past several decades have brought about many iterations of computer form factors and methods of interacting with them. While the computer itself has miniaturised, the computer as we know it has come to be defined by the rectangular glass form of its user interface. Immersive technologies now provide the potential for us to ‘step through the glass’ and interact with information in the 3D space around us. Can spatial interaction improve the way we perceive, interact with, and understand information? In this talk I will present my work on spatial interface design and recent applications in immersive analytics. I will discuss what we have learned about the benefits of using the space around us and some of the challenges that lie ahead.

Bio

Dr Barrett Ens is currently a member of the Data Visualisation and Immersive Analytics research group at Monash University. His research interests include novel input methods for augmented reality, spatial user interfaces, and immersive analytics. During his PhD work at the University of Manitoba, he completed two research internships with the User Interface Research Group at Autodesk Research. He later worked as an NSERC Postdoctoral Research Fellow at UniSA’s Empathic Computing Lab.

Please do not hesitate to share this with your students and colleagues!


19 Nov 2021: Talk by Shlomo Berkovsky

On 19 November 2021, from 12-1pm, A/Prof Shlomo Berkovsky will give a talk for the SydCHI community.

Sign up via Eventbrite. We’ll email a Zoom link for participants to join 1-2 days in advance.

Health Personalisation: From Wellbeing to Medicine

Abstract

The current agenda of health personalisation research mainly revolves around lifestyle and wellbeing. A number of works on personalised technologies for physical activity, food intake, mental support, health information consumption, and more have been published recently. While these mainly addressed the patient as the recipient of the personalised service, strikingly little attention has been paid to personalised medical applications targeting clinical users. In this talk, we turn the spotlight to such medical use-cases and the advantages personalisation can bring there. We will give an overview of established healthcare processes and highlight the touch points where personalised support can improve clinicians’ decision making. We will also discuss the differences between patient- and clinician-facing personalisation, particularly focussing on risk, trust, and explainability.

Bio

Shlomo Berkovsky is a Computer Scientist, with deep theoretical and applied expertise in several areas related to human-centric applications of Artificial Intelligence and Machine Learning. His original research areas include user modelling and personalised technologies. Currently, he leads the Precision Health research stream at the Centre for Health Informatics, Macquarie University. The stream focusses on the use of AI to develop patient models and personalised predictions of diagnosis and care, and studies how sensors can be deployed to predict medical conditions, and how clinicians and patients interact with health technologies.

To join the talk, sign up via Eventbrite.

Please do not hesitate to share this with your students and colleagues!


22 Oct 2021: Talk by Judy Kay

On 22 October 2021, from 12-1pm, Prof Judy Kay will give a talk for the SydCHI community.

Sign up via Eventbrite. We’ll email a Zoom link for participants to join 1-2 days in advance.

In a World Awash With Personal Data, How Can We Empower People To Harness And Control Their Data?

Abstract of Judy’s talk

As technology pervades our lives through an increasingly rich ecosystem of digital devices, these devices can capture huge amounts of long-term personal data. A core theme of my research has been to create systems and interfaces that enable people to harness and control that data and its use. This talk will share key insights from a series of case studies from that work, and plans to build upon them. The first case studies explored how to harness data from wearables, such as smart watches, for personal informatics interfaces that help us gain insights about ourselves over the long term, for analysis of a large dataset (over 140,000 people), and for Virtual Reality games for exercise. The second set of case studies comes from formal education settings, where personal data interfaces called Open Learner Models (OLMs) can harness learning data. I will share key insights that have emerged for a research agenda: OLMs for life-wide learning; the nature of the different interfaces needed for fast, versus slow and considered, thinking; communicating uncertainty; scaffolding people to really learn about themselves from their data; and how these link to urgent challenges of education in an age of AI, fake news and truth decay.

Judy’s bio

Judy Kay is the Payne-Scott Distinguished Professor of Computer Science at the University of Sydney, where she leads the Human Centred Technology Research Cluster in the Faculty of Engineering. A core focus of her research has been to create infrastructures and interfaces for personalisation, especially to support people in lifelong, life-wide learning. This ranges from formal education settings to supporting people in harnessing the long-term data from their personal digital ecosystem to support self-monitoring, reflection and planning. She has created new forms of interaction including virtual reality, surface computing, wearables and ambient displays. Her research has been commercialised and deployed, and she has extensive publications in leading venues for research in user modelling, AIED, human-computer interaction and ubicomp. She has had leadership roles in top conferences in these areas and is Editor-in-Chief of the International Journal of Artificial Intelligence in Education (IJAIED) and an Editor of Interactive, Mobile, Wearable and Ubiquitous Technology (IMWUT).

To join the talk, sign up via Eventbrite.

Please do not hesitate to share this with your students and colleagues!


17 Sept 2021: Talk by Aaron Quigley

On 17 September 2021, from 12-1pm, Prof Aaron Quigley will give a talk for the SydCHI community.

Sign up via Eventbrite. We’ll email a Zoom link for participants to join 1-2 days in advance.

Human Object Interaction

Abstract of Aaron’s talk

The exploration of novel sensing to facilitate new interaction modalities is an active research topic in Human-Computer Interaction. Across the breadth of HCI we can see the development of new forms of interaction underpinned by the appropriation or adaptation of sensing techniques based on the measurement of sound, light, electric fields, radio waves, biosignals, etc. In this talk I will delve into three forms of sensing for object detection and interaction (radar, blurred images and touch) and discuss future research directions.

Aaron’s bio

Aaron Quigley is a Professor of Computer Science in the School of Computer Science and Engineering (CSE) at the University of New South Wales in Sydney, Australia, where he serves as Head of School. Aaron’s research interests include discreet computing, global HCI, pervasive and ubiquitous computing and information visualisation. He has published over 190 internationally peer-reviewed publications including edited volumes, journal papers, book chapters, conference and workshop papers. Aaron is an ACM Distinguished Member, general co-chair for the ACM CHI Conference on Human Factors in Computing Systems in 2021, technical program chair for the ACM EICS conference, serves on the ACM CHI Steering Committee and serves on the Yirigaa Advisory Board.

You can read more on his website: https://aaronquigley.org/biography/

To join the talk, sign up via Eventbrite.


29 June 2021: New Committee appointed

We’re pleased to announce the SydCHI Committee for 2021! They take over from the inaugural Committee that began SydCHI a little over a year ago.

New Committee

A warm thank you to the outgoing Committee of 2020:


16 March 2021: ACM Distinguished Speaker talk by Prof Stephen Brewster

On 16 March 2021, from 9am to 10am, Prof Stephen Brewster will give a talk for the SydCHI community. We encourage everyone to join in via Zoom.

Haptics and Levitation Interfaces: The future of human-computer interaction?

Abstract of Stephen’s talk

Ultrasound provides some brand new opportunities for interaction in user interfaces. In this talk, I will describe this new modality and what it offers to HCI. By using standard loudspeakers, we can create soundfields that generate haptic feedback in mid-air, without the user having to hold or touch anything. We can control the position and texture of this feedback in real time. This ‘mid-air’ haptics enables new interaction techniques around devices. I will give examples of how it can be used for virtual controls and how novel interactions can be designed.

Bio of Prof Stephen Brewster

Stephen Brewster is a Professor of Human-Computer Interaction in the School of Computing Science at the University of Glasgow, where he leads the Multimodal Interaction Group, which is very active and has a strong international reputation in HCI (http://mig.dcs.gla.ac.uk/). His research focuses on multimodal HCI: using multiple sensory modalities and control mechanisms (particularly audio, haptics and gesture) to create a rich, natural interaction between human and computer. He pioneered the study of non-speech audio and haptic interaction for mobile devices with work starting in the 1990s. Brewster’s work has had over 18,000 citations. He was General Chair of CHI 2019 in Glasgow and CHI Papers Chair in 2013 and 2014. He is a member of the ACM SIGCHI Academy, an ACM Distinguished Speaker and a Fellow of the Royal Society of Edinburgh.

This talk is hosted as part of the Basser Seminar Series by the School of Computer Science at the University of Sydney. Everyone is welcome to join us online, as Prof Brewster will be presenting remotely.