Building Wide Intelligence Touch Kiosk UI


This is the user interface that runs on the touch kiosks in the elevator lobbies of the Bill and Melinda Gates Computer Science Complex and Dell Computer Science Hall (GDC) at The University of Texas at Austin. It is part of the Building Wide Intelligence (BWI) research project, which aims to give the building a coherent intelligent presence through touchscreens and robots.

The main purpose of the touchscreen interface is to provide timely and interesting information to students, faculty, and visitors. The application is also capable of dispatching robots from the BWI lab to help with various tasks.

Features
  • Keep up to date on UTCS news and events
  • Navigate GDC with maps of every floor
  • Find who you’re looking for with a searchable directory
  • Check what room your class meets in
  • Learn about GDC and the Computer Science department
  • Interact with Building Wide Intelligence (BWI) robots
  • Use a wheelchair-accessible interface

Frontend
The frontend is a web application that runs in a sandboxed WebKit browser. Because of the specific touch hardware in use in the building, touch events arrive as mouse clicks. To support searching the directory, an on-screen keyboard is implemented in the app.

News, events, directory, room-schedule, and map data come from static JSON files generated by a cron job that reads from a variety of other sources. Touch events are sent back through a PHP endpoint for later usage-pattern analysis. BWI robot interactions are handled over a WebSocket connection (via roslibjs) to the backend, described below.
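The static-JSON pipeline can be sketched roughly as follows. The output directory, file names, and JSON schema here are illustrative assumptions, not the project's actual ones; a real run would replace the placeholder fetcher with reads from the upstream sources.

```python
import json
from datetime import datetime, timezone
from pathlib import Path

# Hypothetical directory served as static files to the kiosk frontend.
OUT_DIR = Path("kiosk_data")

def fetch_news():
    # Placeholder for reading a real upstream source (RSS feed, database, etc.).
    return [{"title": "UTCS Colloquium", "date": "2015-04-01"}]

def write_feed(name, items):
    """Write one static JSON file the frontend can fetch.

    Writing to a temp file and renaming keeps the kiosk from ever
    seeing a half-written file if it fetches mid-update.
    """
    OUT_DIR.mkdir(parents=True, exist_ok=True)
    payload = {
        "generated": datetime.now(timezone.utc).isoformat(),
        "items": items,
    }
    tmp = OUT_DIR / (name + ".json.tmp")
    tmp.write_text(json.dumps(payload, indent=2))
    tmp.replace(OUT_DIR / (name + ".json"))  # atomic rename on POSIX

write_feed("news", fetch_news())
```

A cron entry would simply run one such script per feed on whatever refresh interval each source needs.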

Backend
The backend is only partially developed right now. It is a Python-based ROS node that listens on topics published by the frontend via rosbridge, handles requests, and stores relevant information in a MySQL database.
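The request-handling path might look like the following sketch. The topic name, message fields, and table layout are assumptions; sqlite3 stands in for MySQL, and the rospy/rosbridge subscription wiring is omitted so the logic stays self-contained.

```python
import json
import sqlite3

def open_db(path=":memory:"):
    # sqlite3 stands in for the project's MySQL database here;
    # the table layout is an illustrative assumption.
    db = sqlite3.connect(path)
    db.execute(
        "CREATE TABLE IF NOT EXISTS robot_requests ("
        " id INTEGER PRIMARY KEY, kiosk TEXT, task TEXT, detail TEXT)"
    )
    return db

def handle_request(db, raw_msg):
    """Callback for a hypothetical /kiosk/robot_request topic.

    The frontend's roslibjs publish crosses rosbridge as JSON; in a
    real node, a rospy subscriber would hand this callback a typed
    ROS message rather than a raw JSON string.
    """
    msg = json.loads(raw_msg)
    db.execute(
        "INSERT INTO robot_requests (kiosk, task, detail) VALUES (?, ?, ?)",
        (msg["kiosk"], msg["task"], msg.get("detail", "")),
    )
    db.commit()

db = open_db()
handle_request(db, '{"kiosk": "gdc-2n", "task": "guide", "detail": "room 2.216"}')
```

Parameterized queries (the `?` placeholders) matter here because the message content ultimately originates from a public touchscreen.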