QCon 2018 – Rethinking HCI with Neural Interfaces

Title: Rethinking HCI with Neural Interfaces
Speaker: Adam Berenzweig @madadam

See the table of contents for more blog posts from the conference.


Minority Report analysis

  • Why does he need gloves to interface?
  • ergonomics – it's tiring to hold your arm up

History of UI Paradigm Shifts

  • Command line – we still use the command line; just not exclusively
  • mouse, graphics – original Apple. Design innovation; not just tech
  • Minesweeper and Solitaire were built in so people could learn how to use the mouse – right-click for Minesweeper and click/drag for Solitaire
  • MIT wearable computing in 1993 paved way for Google Glass. [but successful]
  • Joysticks, gloves, body (Kinect), eye tracking, VR/AR headsets
  • Had the audience raise hands if wearing a computer. Not many Apple Watch people in the room
  • Future: tech is always there. It knows about the world around you and is always ready

Book recommendation: Rainbow’s End – an old man gets rejuvenated (or something) and comes back younger needing to learn new tech

Intro to Neural Interfaces

  • Interface devices that translate muscle movement into actions
  • Human input/output has high bandwidth compared to typing or the like. We think faster than we can relay information; output is the constraint.
  • Myo – for an amputee, electrodes on the arm control a prosthetic arm
  • Neural interfaces capture the information you would have sent to a muscle or physical controller
  • Lots of stuff happens in the brain, but you don’t want all of it. You want the intentional part without having to filter out everything else. The motor cortex controls muscles, so it represents voluntary control. Also, you don’t have to plant electrodes on the brain.

Examples

  • Touch typing without a keyboard present [not very practical, as it is hard to touch type without seeing the keys]
  • Mirrors the intention of moving muscles even if the physical attempt is blocked
  • VR/AR – more immersive experience

Designing for Neural Interfaces

  • Want to maximize control/minimize effort
  • Cognitive limits – what can people learn/retain
  • A mouse has two degrees of freedom; a laser pointer has three. There is also six-degree control in space. The human body has more than six degrees of freedom. Are humans capable of controlling an octopus?
  • How efficient is the input compared to existing control devices?
  • It is possible to control three cursors at once, but it is exhausting. Not a good design.
  • Different people find different things intuitive. Which way is up?
  • Don’t translate existing UIs. Can evolve over time.

My take

Fun! Great mix of pictures, videos and concepts. I learned a lot. Would be interesting to see this vs the privacy/ethics track. Imagining what data it could have reading your mind/muscles.

QCon 2018 – Privacy Ethics – A Big Data Problem

Title: Privacy Ethics – A Big Data Problem
Speaker: Raghu Gollamudi

See the table of contents for more blog posts from the conference.


GDPR (General Data Protection Regulation) – took effect May 25, 2018

Data is exploding

  • Cost of storing data is so low that it is essentially free
  • 250 petabytes of data a month. What comes after petabytes?
  • Companies get more data when they acquire other companies
  • IoT data is ending up in massive data lakes

Sensitive information – varies by domain

  • Usernames
  • user base – customers could be sensitive for a law firm
  • location – the issue with a fitness tracker identifying the location of a military base
  • purchases – disclosing someone is pregnant before they tell people
  • employee data

Changes over time – more data gets collected after the decision to log is made

Privacy vs security

  • privacy – individual right, focus on how data used, depends on context
  • security – protect information, focus on confidentiality/accessibility, explicit controls
  • privacy is an under-invested market. Security is more mature [but still an issue]

Solutions

  • culture
  • invest more – GDPR fines are orders of magnitude higher than privacy budgets
  • include in performance reviews
  • barrier to entry – must do at least what Facebook does if in that space
  • security – encrypt, anonymization/pseudonymization, audit logs, store credentials in a vault
  • reuse – use solutions available to you
  • design for data integrity, authorization, conservative approach to privacy settings
  • include privacy related tasks in sprint
  • design in data retention – how long do you need it for
  • automation – label data (tag/classify/confidence score) so compliance can be automated. The confidence score helps reduce false positives
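The automation bullet above can be sketched in code. This is a minimal toy, not anything from the talk: the field names, patterns, and confidence values are all hypothetical, and a real system would use much richer classifiers. The idea is simply to tag each field with candidate labels plus a confidence score, then drop low-confidence labels to cut false positives.

```python
import re

# Hypothetical patterns and confidence values, for illustration only.
PATTERNS = {
    "email": (re.compile(r"^[\w.+-]+@[\w-]+\.[\w.]+$"), 0.9),
    "phone": (re.compile(r"^\+?[\d\s()-]{7,15}$"), 0.6),
    "zip":   (re.compile(r"^\d{5}(-\d{4})?$"), 0.5),
}

def classify(value: str):
    """Return (tag, confidence) pairs for a raw field value."""
    return [(tag, conf) for tag, (rx, conf) in PATTERNS.items() if rx.match(value)]

def tag_record(record: dict, threshold: float = 0.55):
    """Tag each field, keeping only labels above the confidence
    threshold to reduce false positives."""
    return {
        field: [t for t in classify(str(v)) if t[1] >= threshold]
        for field, v in record.items()
    }

print(tag_record({"contact": "alice@example.com", "code": "12345"}))
```

With the threshold at 0.55, the low-confidence "zip" match on "12345" is discarded while the email label survives – the same trade-off the talk describes.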

The EU currently has the strictest privacy policy; Germany and Brazil are working on their own. There was a debate on whether GDPR applies to EU citizens or residents. Mostly agreement that physical location matters.

My take

I was expecting this to be more technical. There was a little about the implications of big data, like automation, but it felt glossed over. I would have liked to see an example of some technique that involves big data. The session was fine. It covered a lot of areas in passing, which is good for an opening session – it lets you know where to look further. I think not having the “what you will learn” section in the abstract made it harder to know what to expect. Maybe QCon should make this mandatory?

QCon 2018 – Keynote – Developers as Malware Distribution Vehicle

Title: Developers as a Malware Distribution Vehicle
Speaker: Guy Podjarny @GuyPod

See the table of contents for more blog posts from the conference.


Developers have more power than ever – they can get more done, faster. They can also do more harm.

XCodeGhost – in 2015

  • XCode went from 3GB to 5GB
  • Too slow to download in China
  • Developers used a local mirror
  • They had to trust an unofficial download
  • XCodeGhost is XCode plus a malicious component that compiles into the apps it builds. It targets the linker.
  • Went undetected for 4 months
  • Contaminated hundreds of Chinese apps and dozens of US apps
  • The US got it from Chinese-built apps and via a library
  • Got up to 1.4M active victims a day
  • Apple fixed it in the App Store immediately, but it took months to reach users, including enterprises
  • The real “fix” was to take down the websites the malware was contacting
  • Apple fixed the root problem by hosting an official XCode download in China
  • Because it targeted the linker, developers were the distribution vehicle.

Delphi virus – Induc – 2009

  • Targets Delphi
  • Every program compiled on the machine is affected
  • Even if you uninstall and reinstall Delphi, it stays
  • Took 10 minutes to find
  • No app store, so harder to remove
  • Affected millions

First instance of this concept  – 1984

  • “Reflections on Trusting Trust” – Ken Thompson
  • Modify C compiler to “miscompile”
  • Three trojans – allow a hard-coded password (in login), replicate the logic when compiling the C compiler, and use the disassembler to hide; it deletes itself from the source code
  • Wrote a proof of concept. It's believed it didn't escape Bell Labs
  • Can't find it – it's not in the source code and you can't disassemble it
  • The best solution is to compile on two computers/compilers and compare the output. Not practical.
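Thompson's idea can be illustrated with a deliberately tiny toy. This is not his actual mechanism (which operated on a real C compiler's binary output); here the "compiler" is just a string transformer, and the function names and backdoor string are invented for the sketch. The point it demonstrates: the trojan appears in no source code, yet survives recompilation because the trojaned compiler re-inserts it whenever it compiles a clean compiler.

```python
# Toy illustration of the trusting-trust attack. Everything here is
# hypothetical; a real attack modifies generated machine code.

BACKDOOR = '# backdoor: hard-coded password accepted'

def evil_compile(source: str) -> str:
    """A trojaned 'compiler': clean source in, trojaned output out."""
    if "def check_password" in source:
        # Trojan 1: inject a backdoor into the login program.
        return source + "\n" + BACKDOOR
    if "def evil_compile" not in source and "def compile" in source:
        # Trojan 2: when compiling a *clean* compiler, re-insert this
        # trojan, so it persists with no trace in any source file.
        return source + "\n# trojan re-inserted into compiler output"
    return source  # everything else compiles untouched

clean_login = "def check_password(password): ..."
print(evil_compile(clean_login))  # output now contains the backdoor
```

This is also why the "compile on two independent compilers and compare" defense works in principle: a clean second compiler would not perform the re-insertion step.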

Malicious dependencies

  • npm bad dependency
  • PyPI bad dependency this year
  • Docker bad image this month

Must trust the people who write the software.

We ship code faster. It's hard to tell whether a developer introduced code maliciously or accidentally.

Developers have access to user data. Be careful.

Syrian Army and Financial Times

  • phishing email
  • link redirects to a spoofed Financial Times page
  • now they have emails, so they send emails that look like they're from the Financial Times
  • IT attempted to warn users.
  • The attacker sent an identical email with evil links
  • Gained access to the official Twitter account
  • The Syrian Army used it to make statements
  • A developer noted that even people wise to this still fall for it. We all fall for this.
  • Salesforce did an internal phishing test and developers were the second-highest clickers

Uber – 2016

  • Attackers got driver and user data
  • Uber paid a 100K ransom. It later agreed it shouldn't have
  • The public found out a year later
  • Developers had stored an S3 token in a private GitHub repo
  • Not using 2FA
  • Developers can access extremely sensitive data, and share it too often
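The stored-token mistake above is one of the easiest to automate away. Below is a minimal sketch of a pre-commit secret scan, under stated assumptions: the regexes are illustrative only (the AWS access-key prefix pattern is well known, the "generic" pattern is invented here), and real scanners such as entropy-based tools cover far more credential formats.

```python
import re

# Illustrative patterns only; real scanners cover many more formats.
SECRET_PATTERNS = [
    ("aws_access_key_id", re.compile(r"\bAKIA[0-9A-Z]{16}\b")),
    ("generic_credential",
     re.compile(r"(?i)\b(secret|token|passwd)\s*=\s*['\"][^'\"]{8,}['\"]")),
]

def scan(text: str):
    """Return (label, matched_text) pairs for anything that looks
    like a credential about to be committed."""
    hits = []
    for label, rx in SECRET_PATTERNS:
        for m in rx.finditer(text):
            hits.append((label, m.group(0)))
    return hits

source = 'aws_key = "AKIAABCDEFGHIJKLMNOP"\ntoken = "s3cr3t-value-123"'
for label, snippet in scan(source):
    print(f"possible {label}: {snippet}")
```

Wired into a pre-commit hook that rejects any commit with hits, this directly targets the Uber failure mode: credentials never reach the repo, private or not, and the vault bullet from the privacy talk becomes the only place tokens live.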

As we get more power, we need to get more responsible

Causes of insecure decisions:

  • Different motivations – focus on functionality. Security is a constraint; need to be cognizant of it
  • Cognitive limitations – we move fast and break things
  • Lack of expertise – don’t always understand security implications
  • Developers are overconfident. It's harder to train people who think they already know it.
  • “It doesn’t happen to me.” Security breaches happen to everyone.

Mitigations

  • Learn from past incidents
  • Automate security controls
  • Make it easy to be secure
  • Developer education
  • Manage access like the tech giants
  • Challenge access requests. When is it needed? For how long? What happens if you don't have access? What can go wrong with access? How would you find out about access being compromised?

Google BeyondCorp

  • All access routes through a corporate proxy
  • Proxy grants access per device – limits what you can do from a Starbucks
  • Monitoring access

Microsoft Privileged Access Workstations (PAW)

  • Access to production can only be from a secure machine
  • No internet from the secure machine
  • Your machine is a VM on the secure machine

My take

Great start to the day. I had known about some of these, but not others. For some reason, this reminds me of developer ghost stories.