Basically, it takes your live video and applies all sorts of trippy effects - ASCII art, kaleidoscope patterns, edge detection, you name it. You can control everything with hand gestures (which is surprisingly fun) or voice commands if you're feeling fancy. It's written in Python, using pygame, mediapipe, scipy, imutils, and a few other libraries. Full disclosure: I did use AI to write some of the explanatory docs. I had written them myself, but my laptop broke before I could commit, and I didn't want to go through all of that again, so I apologize for that.
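To give a feel for how an effect like the ASCII art one can work, here's a minimal sketch (not the project's actual code): downsample a grayscale frame into cells and map each cell's average brightness to a character from a density ramp. The `frame_to_ascii` helper and the `ASCII_RAMP` string are illustrative names I made up for this example; a synthetic gradient stands in for a real webcam frame.

```python
import numpy as np

# Hypothetical density ramp, ordered dark -> bright.
ASCII_RAMP = " .:-=+*#%@"

def frame_to_ascii(gray, cols=8, rows=4):
    """Downsample a grayscale frame into a grid of cells and map each
    cell's mean brightness to a character from the ramp."""
    h, w = gray.shape
    cell_h, cell_w = h // rows, w // cols
    lines = []
    for r in range(rows):
        row_chars = []
        for c in range(cols):
            cell = gray[r * cell_h:(r + 1) * cell_h,
                        c * cell_w:(c + 1) * cell_w]
            idx = int(cell.mean() / 256 * len(ASCII_RAMP))
            row_chars.append(ASCII_RAMP[min(idx, len(ASCII_RAMP) - 1)])
        lines.append("".join(row_chars))
    return "\n".join(lines)

# A synthetic left-to-right gradient stands in for a webcam frame.
frame = np.tile(np.linspace(0, 255, 64), (32, 1)).astype(np.uint8)
print(frame_to_ascii(frame))
```

In the real app you'd feed this a frame from the camera each tick and blit the resulting characters with pygame instead of printing them.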