[devnexus 2024] breaking ai: live coding and hacking apps with generative ai

Speaker: Micah Silverman

For more, see the 2024 DevNexus Blog Table of Contents


Changes/Trends

  • AI is not a silver bullet
  • Treat AI like a junior dev and verify everything produced
  • A new iteration of copy/paste; more code like what we got from Stack Overflow
  • We don’t do thorough code reviews of library code. We look at security, the number of committers, and when the last release was
  • Different because our name is now on the commit
  • More challenging to maintain visibility
  • Backlog ever growing

Common use in Dev

  • Adding comments
  • Summarizing code
  • Writing Readme
  • Refactoring code
  • Providing templates
  • Pair programming
  • Generating code (the new stack overflow)

Stats

  • 92% of software devs are using AI in some form
  • Those who use AI are 57% faster
  • Those who use AI are 27% more likely to complete a task
  • 40% of Copilot-generated code contains vulnerabilities. The stat is the same without AI, so this is what we trained it on
  • Those using AI wrote less secure code but believed it was more secure. We trust AI too much. Junior devs get more scrutiny than senior devs
  • [Some of these stats aren’t causation. Ex: early adopters]

Using AI well

  • Good starting point
  • Don’t use without review

Problems

  • Hallucinations – including supporting “evidence” even when wrong. Gave an addition example
  • ChatGPT got worse at math over a few months, from 98% to 2%. Now there are some math-specific AIs
  • “ChatGPT is confidently wrong” – Eelko de Vos
  • First defamation lawsuit – ChatGPT made up case law
  • AI doesn’t know when it’s wrong

AI and Code

  • Asked for an Express app taking a name as a request parameter. Tried a bunch of times and the name parameter was never sanitized, so there were cross-site scripting vulnerabilities. Ideally it wouldn’t auto-generate vulnerabilities (see the sketch after this list)
  • Can give bad advice – asked if code was safe from NoSQL injection. GPT and Bard said it was safe. It was not safe.
  • Samsung put all their code in ChatGPT and leaked code, keys, and trade secrets. It became part of the training data. ChatGPT says it got better about dealing with secrets.
  • The terms of service of ChatGPT say it can use anything as training data
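The Express code itself isn’t captured in these notes. As a rough illustration of the same class of bug, here is a minimal sketch in Java (to match the Spring Boot demo app later in the talk, not the Express demo); the controller and parameter names are my own assumptions, not code from the session. Echoing a request parameter straight into markup is reflected XSS; escaping the value closes the hole.

```java
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;
import org.springframework.web.util.HtmlUtils;

// Hypothetical controller for illustration only; not code from the talk.
@RestController
public class GreetingController {

    // Reflected XSS: the name parameter is echoed into the HTML unescaped,
    // so ?name=<script>...</script> runs in the victim's browser.
    @GetMapping("/greet-unsafe")
    public String greetUnsafe(@RequestParam String name) {
        return "<h1>Hello, " + name + "!</h1>";
    }

    // Escaping the value before it reaches the markup removes the injection point.
    @GetMapping("/greet")
    public String greet(@RequestParam String name) {
        return "<h1>Hello, " + HtmlUtils.htmlEscape(name) + "!</h1>";
    }
}
```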

Sample Conference App

  • Using CoPilot
  • Spring Boot app in IntelliJ

Basic example

  • Gave Copilot comments/prompts to create code
  • Showed generating code to read a file that doesn’t exist due to a typo

JPA SQL injection example

  • Noted this is not the most common way to write JPA here in 2024; you usually don’t need a direct query
  • Showed Copilot offers a prompt to get the result
  • Tried adding to the prompt to protect against SQL injection. Got a naive regex sanitizer
  • Then tried requesting named parameters in the query. After that, did prompts for setting the parameter and getting the result. All was well (see the sketch below).
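The demo’s exact query isn’t captured here, but the before/after pattern is worth pinning down. A minimal sketch, assuming a Person entity and a plain JPA EntityManager (the class and method names are my own, not the demo’s): concatenating the user’s value into the JPQL string is the injectable version, while the named parameter lets the persistence provider bind the value instead, which is roughly what Copilot produced once asked.

```java
import jakarta.persistence.EntityManager;
import java.util.List;

// Hypothetical lookup class; Person is assumed to be the demo app's JPA entity.
public class PersonLookup {

    private final EntityManager entityManager;

    public PersonLookup(EntityManager entityManager) {
        this.entityManager = entityManager;
    }

    // Vulnerable: splicing user input into the JPQL string allows injection.
    public List<Person> findByNameUnsafe(String name) {
        return entityManager
                .createQuery("SELECT p FROM Person p WHERE p.name = '" + name + "'", Person.class)
                .getResultList();
    }

    // Safer: a named parameter is bound by the persistence provider and is
    // never interpreted as part of the query text.
    public List<Person> findByName(String name) {
        return entityManager
                .createQuery("SELECT p FROM Person p WHERE p.name = :name", Person.class)
                .setParameter("name", name)
                .getResultList();
    }
}
```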

Example

  • Tried getting the file, then with a file separator
  • Successfully got a constant defined in the file, so some context sensitivity. But it ignored the file separator request from the prompt
  • Then requested saving the file and it successfully generated code to write it
  • Requested getting a person and it successfully called the already written getPerson() method
  • Then set the image name
  • Copilot offered to write the prompt to save the person.
  • Then requested adding the message and it added the attribute to the model
  • However, it has a path traversal issue
  • Showed Burp Suite monitoring as the example ran. “Send to repeater” keeps cookies and such while letting you alter the request.
  • Changed the file name to ../image/snyklogo.png. If it works, it will replace the logo with the uploaded pic
  • Showed the Snyk IDE extension, which also noted the path traversal issue
  • Tried asking to sanitize input against path traversal and got a check for two dots. Not good enough, but a first pass
  • Tried asking for a whitelist to protect against directory traversal. It checked the directory name prefix, which is better but also not a whitelist
  • Then tried requesting to validate that there is not a path traversal using the normalize method. It did what was requested, including the prefix check from the previous prompt, but Micah noted that’s what he had done in the past (see the sketch below)
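The generated checks (two dots, then a directory-name prefix) are weaker than the normalize-and-compare approach the last prompt was aiming for. Here is a minimal sketch, assuming an uploads directory as the base; the class and method names are my own, not the demo’s. Resolve the submitted file name against the base, normalize it, then confirm the result is still inside the base. Path.startsWith compares whole path components, so it avoids the false match that a raw string prefix check (for example, an "uploads-evil" sibling directory) would allow.

```java
import java.nio.file.Path;
import java.nio.file.Paths;

// Hypothetical helper; the base directory and names are assumptions for illustration.
public class UploadPathValidator {

    private static final Path UPLOAD_DIR = Paths.get("uploads").toAbsolutePath().normalize();

    // Resolve, normalize, then verify the result never escapes the base directory.
    // A bare "contains .." check misses cases this catches, such as absolute paths.
    public static Path resolveSafely(String fileName) {
        Path resolved = UPLOAD_DIR.resolve(fileName).normalize();
        if (!resolved.startsWith(UPLOAD_DIR)) {
            throw new IllegalArgumentException("Path traversal attempt: " + fileName);
        }
        return resolved;
    }

    public static void main(String[] args) {
        System.out.println(resolveSafely("snyklogo.png"));          // stays inside uploads
        System.out.println(resolveSafely("../image/snyklogo.png")); // throws
    }
}
```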

What we can do

  • Use a tool to scan code and tell you when it makes a mistake
  • Learn. Ex: Snyk has a lesson on prompt injection – getting an AI to reveal info it shouldn’t

Future

  • We currently have math-aware AIs. Maybe we will have security-aware AIs in the future

My take

Good mix of background and live code. I like that Micah didn’t assume security knowledge while keeping it engaging for people who are familiar. I had never seen Burp Suite, so that was a happy bonus in the demo.

[devnexus 2024] More tales from the Dark Side: How AI is the bad guys’ new friend

Speaker: Steve Poole

@spoole167

For more, see the 2024 DevNexus Blog Table of Contents


General

  • Supply chain
  • Now we are all attack vectors

Wifi

  • We also use wifi
  • How many use VPN?
  • Easy to spoof wifi
  • Only need a battery, a Raspberry Pi, and a few more things
  • Would you notice a box on the wall?

Charger

  • Plug in Mac laptop charger at conference
  • If leave unattended, someone could add hardware
  • Any USB has this problem
  • A USB data cable and a power cable look the same

Hotel rooms

  • Hidden camera
  • In some countries during the Cold War, they used human choreography to influence where you sit
  • Becoming more common
  • More people are a pass-through to the company now

Phishing

  • Getting better
  • More targeting. Can know how the company does things, or know the boss’s name
  • Phishing -> Spear Phishing -> Personalized Attacks
  • Moving towards more organized and long term attacks

Adding AI

Bad things it can do

  • Deepfake nude generator
  • Deepfake phishing grew by three thousand percent in 2023

Why now

  • Not hard to do a reasonable fake. A USB accelerator is sixty bucks
  • Huggingface.co has lots of models
  • Models and data are available to you and the bad guys

Other problems

How to protect

  • Paper on identifying mouth inconsistencies for lip synching
  • Text/numbers wrong
  • Find anomalies from lack of training data – this is going to be an arms race. Once the AI knows it was wrong, it can do better next time.
  • Be more suspicious
  • Secure supply chain – all the pieces involved in creating and delivering software
  • Control AI tools in process
  • Look at where models came from and decide if they’re safe. You will have to prove where you got them
  • Consider how you train AI and when you retrain it
  • The government wants an SBOM, an automated supply chain, evidence of software integrity, and regular audits
  • SBOMs (software bills of materials) don’t find malicious code but ensure you know what you have

My take

Demos were great. Security has changed a lot. Good emphasis on security depending on how much money you spend on it. It’s scary, but it’s supposed to be. I need to think about what else I can do in my own life.

Someone challenged this, saying the grandparent scam sounds fake and nothing like the person. Steve didn’t get to reply, but it’s not a fair analogy. The grandparent scam isn’t targeted (at least not much). Someone targeting you specifically will have audio/video of you to base it off of. And then we are back to the seven seconds being enough.

[devnexus 2024] teaching your kid programming from the perspective of a kid

Speaker: Cassandra Chin

@cassandraonjava

For more, see the 2024 DevNexus Blog Table of Contents


General

  • Steven Chin’s daughter.
  • Worked with coding and YAML in Minecraft
  • Started teaching kids to program at conferences at age 14
  • Junior in college
  • Creating a podcast at her internship for younger people (ex: college)

Tech diversity

  • 20 years of female tech panels and we still need them
  • Women who try AP Comp Sci in high school are ten times more likely to major in it.
  • Black/Latino students are seven times more likely.
  • Need to provide opportunity
  • Even at 6 years old, kids think computers are more suited to boys. By fifth grade it tapers down, so that’s the sweet spot for starting.

Kids and code

  • Schools mandate human/world languages, but not coding languages
  • Since schools don’t always provide it, parents need to
  • Not all screen time is equal
  • Limit youtube
  • Minecraft in middle
  • Best use is learning to code – ex: Scratch
  • Redirect computer use vs taking away

Mistakes for parents to avoid

  • Don’t leave your daughters out. Bring them to tech events
  • Computers at home matter – an actual computer, not a tablet. It lets them do more than play mobile games
  • You don’t need to be good at math. While Assembly requires math, nobody uses it anymore. Modern programs use logic, not math
  • Kids dislike math the most, followed by foreign language. Computers is the third highest. Both things above are types of art.
  • Don’t start with books like Discrete Math
  • Give examples of programmers that they can relate to
  • Don’t start with boring parts like what an array is. Better to start with legos
  • Don’t do the code for the kids. They won’t learn. Never grab the mouse or keyboard. If you have to, it means the content is too hard

Geniuses

  • Anyone can learn to code. Don’t have to be super smart.
  • Kids who are told programmers are geniuses do worse than kids who think practice will make them better

Books

  • Phippy’s AI Friend – comes with an online workshop that takes about an hour. She actually uses the book as a prop
  • Coding for Kids Python
  • Girls Who Code

Helping kids

  • Relate to your kids’ hobbies. Ex: discuss who built it
  • Lego Spike – build robot and do block coding
  • mBot (Makeblock). Uses screws instead of Legos. Don’t have to use blocks
  • Hour of Code. Lots of themes
  • Choose age appropriate. Often we choose something too hard
  • Squishy circuits for 3-9 year olds
  • Raspberry Pi and Arduino – 9-15 years old
  • Groups of two work best. With three kids, the youngest will often feel left out
  • Take kids to locally run workshops – ex: conferences, Girls Who Code

My take

I liked her responses to Todd’s mini interview at the beginning while they dealt with AV issues. Great humor. I liked that she made a joke about her dad being there to tell jokes. I also liked “I’m not the daughter of Steven Chin; I have a name”. Great content throughout, whether new to the topic or not.

The content resonated well. I gave my best friend’s five-year-old daughter a toy robot for her fifth birthday. I enjoyed seeing her play. I now have a gift idea for next year!

I also liked the demo from her book!