import streamlit as st

# Set page configuration
st.set_page_config(
    page_title="Asimo Foundation | CV Journey",
    page_icon="🤖",
    layout="wide",
    initial_sidebar_state="expanded",
)

# Title and introduction
st.header("🤖 Asimo Foundation - STEM Education")
st.markdown(
    """
    ### Bringing Technology Education to Public Schools

    The Asimo Foundation is a social project at UNIFEI that aims to reduce educational inequality
    by bringing STEAM (Science, Technology, Engineering, Arts, and Mathematics) education to public schools in the region.

    This initiative:
    - Introduces students to robotics, programming, and technology
    - Provides hands-on experience with Arduino, Lego Mindstorms, and ESP32
    - Develops problem-solving and critical thinking skills
    - Inspires interest in technology and engineering careers
    """
)

# Project details in tabs
project_tabs = st.tabs(["Mission & Impact", "Technologies", "Teaching Methodology"])

with project_tabs[0]:
    col1, col2 = st.columns([3, 2])
    with col1:
        st.markdown(
            """
            ### Our Mission

            The Asimo Foundation believes that all students, regardless of socioeconomic background,
            deserve access to high-quality STEM education. By bringing technology education to public
            schools, we aim to:

            - **Bridge the digital divide** between private and public education
            - **Empower students** with technical skills for the future job market
            - **Inspire curiosity and innovation** in young minds
            - **Provide university students** with teaching experience and community engagement
            """
        )

with project_tabs[1]:
    col1, col2, col3 = st.columns(3)
    with col1:
        st.markdown(
            """
            ### Arduino

            **Applications:**
            - Basic circuits and electronics
            - Sensor integration (temperature, light, distance)
            - Simple robotics projects (line followers, obstacle avoidance)
            - LED control and displays

            **Benefits:**
            - Low cost and widely available
            - Excellent introduction to programming and electronics
            - Versatile platform with thousands of project examples
            """
        )
    with col2:
        st.markdown(
            """
            ### Lego Mindstorms

            **Applications:**
            - Robot construction and design
            - Visual programming introduction
            - Sensor integration and robotics concepts
            - Competitive challenges and problem-solving

            **Benefits:**
            - Intuitive building system
            - Robust components for classroom use
            - Engaging form factor that appeals to students
            - Scaffolded learning progression
            """
        )
    with col3:
        st.markdown(
            """
            ### ESP32

            **Applications:**
            - IoT (Internet of Things) projects
            - Wireless communication
            - Advanced sensing and control
            - Web-based interfaces

            **Benefits:**
            - Built-in Wi-Fi and Bluetooth
            - Powerful processing capabilities
            - Low power consumption
            - Bridge to more advanced applications
            """
        )

with project_tabs[2]:
    st.markdown(
        """
        ### Our Teaching Approach

        We follow a project-based learning methodology that emphasizes:

        1. **Hands-on Exploration:** Students learn by doing, building, and experimenting
        2. **Collaborative Problem-Solving:** Group projects that encourage teamwork
        3. **Incremental Challenges:** Starting with simple concepts and building to complex projects
        4. **Real-World Applications:** Connecting technology concepts to everyday life
        5. **Student-Led Innovation:** Encouraging creativity and independent thinking

        This approach ensures that students not only learn technical skills but also develop critical thinking,
        collaboration, and self-confidence.
        """
    )
st.markdown("---") | |
# Gesture-controlled robotic arm project | |
st.subheader("Featured Project: Gesture-Controlled Robotic Arm") | |
col1, col2 = st.columns(2) | |
with col1: | |
st.markdown( | |
""" | |
### Computer Vision Meets Robotics | |
This project combines computer vision with robotic control to create an intuitive | |
interface for controlling a robotic arm using hand gestures. | |
**How it works:** | |
1. A webcam captures the user's hand movements | |
2. MediaPipe hand tracking detects hand landmarks in real-time | |
3. Custom algorithms convert hand position to servo angles | |
4. Arduino/ESP32 receives commands and controls the servo motors | |
5. The robotic arm mimics the user's hand movements | |
This project demonstrates how computer vision can create natural human-machine interfaces | |
and serves as an engaging introduction to both robotics and CV concepts. | |
""" | |
) | |
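    st.markdown(
        """
        The snippet below is an illustrative, minimal version of the loop,
        not the project's production code (the serial port and servo pin
        are placeholders):
        """
    )
    st.code(
        '''
import cv2
import mediapipe as mp
import pyfirmata2

board = pyfirmata2.Arduino("/dev/ttyUSB0")  # placeholder serial port
servo = board.get_pin("d:9:s")  # servo on digital pin 9 (placeholder)

hands = mp.solutions.hands.Hands(max_num_hands=1)
cap = cv2.VideoCapture(0)

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    # MediaPipe expects RGB; OpenCV captures BGR
    results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.multi_hand_landmarks:
        wrist = results.multi_hand_landmarks[0].landmark[0]
        # Map the wrist's normalized x position [0, 1] to a servo angle [0, 180]
        servo.write(int(wrist.x * 180))
'''.strip(),
        language="python",
    )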
with col2:
    # Placeholder for robotic arm image
    st.image(
        "assets/robotic_arm.jpg",
        caption="Robotic Arm used in the Asimo Foundation project",
        use_container_width=True,
    )

# Technical implementation details
st.subheader("Technical Implementation")

implementation_tabs = st.tabs(["Hand Tracking", "Angle Calculation", "Arduino Control"])
with implementation_tabs[0]:
    st.markdown(
        """
        ### MediaPipe Hand Tracking

        We use Google's MediaPipe framework to detect and track hand landmarks in real time.

        **Key Technologies:**
        - [MediaPipe](https://developers.google.com/mediapipe) - Google's open-source framework for building multimodal ML pipelines
        - [MediaPipe Hands](https://developers.google.com/mediapipe/solutions/vision/hand_landmarker) - Dedicated solution for hand tracking
        - [OpenCV](https://opencv.org/) - Open-source computer vision library

        **What it does:**
        - Detects 21 landmarks on each hand
        - Works in real time on a CPU
        - Provides robust tracking even with partial occlusion
        - Returns normalized 3D coordinates for each landmark

        **Resources:**
        - [MediaPipe GitHub](https://github.com/google/mediapipe)
        - [Hand Tracking Tutorial](https://developers.google.com/mediapipe/solutions/vision/hand_landmarker/python)
        - [OpenCV Documentation](https://docs.opencv.org/)
        """
    )
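    st.markdown(
        """
        As a rough illustration (a minimal sketch, not the project's actual code),
        reading the 21 landmarks from a single image looks like this:
        """
    )
    st.code(
        '''
import cv2
import mediapipe as mp

# static_image_mode=True runs full detection on every call (good for single images)
with mp.solutions.hands.Hands(static_image_mode=True, max_num_hands=1) as hands:
    image = cv2.imread("hand.jpg")  # placeholder image path
    results = hands.process(cv2.cvtColor(image, cv2.COLOR_BGR2RGB))
    if results.multi_hand_landmarks:
        for i, lm in enumerate(results.multi_hand_landmarks[0].landmark):
            # x and y are normalized to [0, 1]; z is depth relative to the wrist
            print(i, lm.x, lm.y, lm.z)
'''.strip(),
        language="python",
    )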
with implementation_tabs[1]:
    st.markdown(
        """
        ### Mapping Hand Position to Servo Angles

        Converting hand landmark positions into meaningful servo angles requires a few mathematical transformations.

        **Key Technologies:**
        - [NumPy](https://numpy.org/) - Fundamental package for scientific computing in Python
        - [SciPy](https://scipy.org/) - Library for mathematics, science, and engineering

        **What it does:**
        - Calculates angles between landmarks
        - Maps raw angles to appropriate servo ranges
        - Applies smoothing and filtering to reduce jitter
        - Converts 3D hand positions to the robotic arm's coordinate space

        **Resources:**
        - [NumPy Documentation](https://numpy.org/doc/stable/)
        - [SciPy Spatial Transforms](https://docs.scipy.org/doc/scipy/reference/spatial.html)
        - [Vector Mathematics Tutorial](https://realpython.com/python-linear-algebra/)
        """
    )
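    st.markdown(
        """
        The sketch below illustrates the kind of geometry involved: computing
        the angle at a joint from three landmark positions with NumPy, then
        mapping it into a servo range (the range limits are placeholders):
        """
    )
    st.code(
        '''
import numpy as np

def joint_angle(a, b, c):
    """Angle in degrees at point b, formed by the segments b->a and b->c."""
    a, b, c = np.asarray(a), np.asarray(b), np.asarray(c)
    v1, v2 = a - b, c - b
    cos = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))

def to_servo(angle, out_min=10.0, out_max=170.0):
    """Linearly map a raw angle in [0, 180] into the servo's safe range."""
    return out_min + np.clip(angle, 0.0, 180.0) * (out_max - out_min) / 180.0

# Exponential smoothing of successive angles reduces frame-to-frame jitter:
# smoothed = 0.8 * previous_angle + 0.2 * new_angle

print(to_servo(joint_angle([0, 1, 0], [0, 0, 0], [1, 0, 0])))  # 90 degrees -> 90.0
'''.strip(),
        language="python",
    )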
with implementation_tabs[2]:
    st.markdown(
        """
        ### Arduino Communication and Control

        The calculated angles are sent to an Arduino, which drives the servos.

        **Key Technologies:**
        - [pyFirmata2](https://github.com/berndporr/pyFirmata2) - Python interface for the Firmata protocol
        - [Firmata](https://github.com/firmata/arduino) - Protocol for communicating with microcontrollers
        - [PySerial](https://pyserial.readthedocs.io/en/latest/) - Python serial port access library
        - [Arduino Servo Library](https://www.arduino.cc/reference/en/libraries/servo/) - Controls servo motors

        **What it does:**
        - Establishes serial communication between Python and the Arduino
        - Formats and sends servo angle commands
        - Controls multiple servo motors in the robotic arm
        - Provides real-time response to hand position changes
        """
    )
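    st.markdown(
        """
        A minimal sketch of driving servos over Firmata from Python (the pin
        numbers and angles below are placeholders, not the arm's real
        configuration):
        """
    )
    st.code(
        '''
import pyfirmata2

# The Arduino must be running a Firmata sketch (e.g. StandardFirmata)
board = pyfirmata2.Arduino(pyfirmata2.Arduino.AUTODETECT)

# "d:9:s" = digital pin 9 in servo mode
base_servo = board.get_pin("d:9:s")
elbow_servo = board.get_pin("d:10:s")

def move(base_deg, elbow_deg):
    # Servo pins accept angles in degrees (0-180)
    base_servo.write(base_deg)
    elbow_servo.write(elbow_deg)

move(90, 45)
board.exit()
'''.strip(),
        language="python",
    )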

# Demo video
st.markdown("### Demo Video")
st.video("assets/hand_control_arm_video.mp4")
st.markdown(
    """
    * [GitHub Repository](https://github.com/Fundacao-Asimo/RoboArm)
    """
)

# Educational impact
st.markdown("---")
st.subheader("Educational Impact")
st.markdown(
    """
    ### Learning Outcomes

    - **Computer Vision Concepts:** Introduction to image processing, feature detection, and tracking
    - **Robotics Fundamentals:** Servo control, degrees of freedom, coordinate systems
    - **Programming Skills:** Python, Arduino/C++, communication protocols
    - **Engineering Design:** System integration, calibration, testing

    ### Student Feedback

    Students find this project particularly engaging because it:
    - Provides immediate visual feedback
    - Feels like "magic" when the arm responds to hand movements
    - Combines multiple disciplines in a tangible application
    - Offers many opportunities for creative extensions and customization
    """
)

# Footer with attribution
st.markdown("---")
st.markdown(
    """
    ### Project Team

    This work was developed and implemented as part of the Asimo Foundation at UNIFEI.
    Special thanks to all the volunteers, educators, and students who contributed to this initiative.
    """
)
st.markdown("[🌐 Asimo Foundation](https://www.instagram.com/fundacaoasimo/)")