Nighty Night
Created a multi-platform system of sensors and a camera designed to help analyze a baby's sleeping position and other conditions, so that parents and caretakers can properly assess and improve a child's bedtime. Studies have shown that sleeping on the side or belly increases the chance of undesirable health conditions for a developing baby, potentially leading to death from causes such as the inability to breathe properly (https://www.nichd.nih.gov/sites/default/files/publications/pubs/documents/SIDS_QA-508-rev.pdf). It is therefore recommended that babies sleep on their backs to minimize such risks. However, it is impossible for a human to keep track of a baby's position all night long, so we decided to build an application that sends notifications alerting parents to change the baby's position, serving as an invaluable tool that eases the burden of child care. The application constantly monitors the baby's health by tracking metrics such as physical body posture, head position, and breathing frequency.
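To give a sense of the alerting logic, here is a minimal sketch (not the actual Nighty Night code). It assumes an upstream sensor/vision pipeline reports a torso roll angle; the angle convention, thresholds, and the Notify() helper are all hypothetical:

```csharp
using System;

// Minimal sketch of the alerting idea (not the actual Nighty Night code).
// Assumes an upstream sensor/vision pipeline reports the baby's torso roll:
// 0 deg = on the back, +/-90 deg = on a side, 180 deg = on the belly.
static class SleepPositionMonitor
{
    const double SideThresholdDeg = 60;                      // beyond this, not on the back
    static readonly TimeSpan AlertAfter = TimeSpan.FromSeconds(30);
    static DateTime? offBackSince;

    public static void OnPoseSample(double torsoRollDeg, DateTime now)
    {
        bool onBack = Math.Abs(torsoRollDeg) < SideThresholdDeg;
        if (onBack)
        {
            offBackSince = null;                             // safe position: reset the timer
        }
        else
        {
            offBackSince ??= now;                            // start timing the unsafe position
            if (now - offBackSince.Value >= AlertAfter)
                Notify("Baby has rolled off their back - please check on them.");
        }
    }

    // Stand-in for the app's push-notification call (hypothetical).
    static void Notify(string message) => Console.WriteLine("ALERT: " + message);
}
```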
Face Konnex identifies people and helps the user learn who they are, what they do, and how they can help others. Project won Top 30 at PennApps, Fall 2018.
A 3D first-person shooter game I made with Unity3D back in 2017 for a game development course I took on Coursera.
The goal is to shoot green boxes to gain points; you must shoot down a certain number of them before the timer runs out. Shooting white boxes grants bonus time, while shooting yellow boxes loses points.
Have Fun!
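For a rough idea of how those scoring rules can be wired up in Unity, here is a minimal C# sketch; the tag names and the point/bonus-time values are illustrative assumptions, not the game's actual numbers:

```csharp
using UnityEngine;

// Minimal sketch of the scoring rules described above, assuming each box is
// tagged "Green", "White", or "Yellow" in the Unity editor (an assumption).
public class BoxScoring : MonoBehaviour
{
    public int score;
    public float timeLeft = 60f;

    // Called by the shooting code when a raycast or projectile hits a box.
    public void OnBoxShot(GameObject box)
    {
        switch (box.tag)
        {
            case "Green":  score += 10;    break;   // green boxes award points
            case "White":  timeLeft += 5f; break;   // white boxes add bonus time
            case "Yellow": score -= 5;     break;   // yellow boxes cost points
        }
        Destroy(box);
    }

    void Update() => timeLeft -= Time.deltaTime;     // count down the round timer
}
```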
Roller Madness: A challenging coin collecting game designed in Unity3D.
Goal: Collect the coins and don't get hit by the AI enemies. You start with a certain number of lives; once they are all used up, the game is over and restarts. Roll the ball and collect a certain number of coins to win.
Good Luck!!
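Here is a minimal sketch of the lives-and-coins loop described above, assuming coin and enemy objects are tagged "Coin" and "Enemy" in the editor; the starting values are illustrative:

```csharp
using UnityEngine;
using UnityEngine.SceneManagement;

// Minimal sketch of the Roller Madness game loop; tag names and counts
// are assumptions, not the project's actual values.
public class RollerState : MonoBehaviour
{
    public int lives = 3;
    public int coins;
    public int coinsToWin = 10;

    void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("Coin"))
        {
            Destroy(other.gameObject);
            if (++coins >= coinsToWin) Debug.Log("You win!");   // enough coins collected
        }
        else if (other.CompareTag("Enemy") && --lives <= 0)
        {
            // Out of lives: game over, restart the level.
            SceneManager.LoadScene(SceneManager.GetActiveScene().name);
        }
    }
}
```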
You play as a character who shoots projectiles at airplanes in the sky. You must score a certain number of points before the timer runs out to complete each level.
Inspiration came from playing games like Call of Duty & Halo. Made at HackTCNJ, Spring 2017
X-Waves addresses a nervous system disorder involving repetitive movements or unwanted sounds (tics). There are more than 200,000 US cases per year, and approximately 10-15 percent of those affected have a progressive or disabling course that lasts into adulthood. X-Waves tries to teach a "competing response," which is the essence of habit reversal therapy: since the brain can't do two tics in the exact same instant, physical tics can essentially be neutralized over time.
The user can associate sound patterns with the feeling of comfort felt after doing physical tics, creating a reinforcing action that can convert physical tics into mental tics.
An app that teaches users how to perform surgery and how to give clear directions in an operating room.
Primary objective: add AR objects onto clothing and create something medical-related. The app helps users learn how to perform surgery on a pretend patient and how to give directions in an operating room. Some specifications that were not met: accurate human organ models, blood-flow simulations, a VR plugin for the surgery room, and medical imaging and labeling techniques with machine learning.
Platform: AR, Mobile
Hardware Specifications: Android phone
Visual Interaction: Organ overlaying, AR pulse check, heart blood-flow check, note-taking features, speech-to-text API.
User Interaction: The user gets a close view of human anatomy to better understand the human body, along with the ability to take notes on what they observe.
Target Audience: Medical education and training
Goal: Collect data on how effectively the app helps users understand anatomy, heart structure, and blood flow. In simple terms, gather data on building a foundational understanding of the human body in a fun, interactive way.
Developed a VR Clothing Simulation to give users a novel method of interacting with clothing items sold by retail companies.
Problems Faced By Customers:
Certain customers may not be able to try on a clothing item due to monetary reasons.
If a customer wants to see how they look in a new item of clothing, he or she must travel to the store and physically try it on.
Main Objectives:
Allow people to see how they look in different outfits in 3D space
Provide users with a changing room
Implement an online shopping platform that incorporates the aforementioned objectives
Detailed Objectives:
Allow the user to zoom in and out of the clothing item and interact with it
Allow the user to choose from a database of clothing items in an intuitive manner
Allow the user to choose his or her gender to reflect his or her body type
Technologies used to build this: Unity3D, Leap Motion, and Oculus Rift.
Unity and Oculus Rift: The primary VR engine we utilized was Unity, and our main programming language was C#. The Oculus Rift provides the means of connecting the user to the virtual reality world.
Leap Motion: To track the user's hand and finger motions, we utilized Leap Motion. The Interaction Engine allowed us to connect this method of user input to VR. The Leap Motion sensor is attached to the laptop running the Unity scripts.
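As a rough illustration of how a clothing item becomes grabbable, here is a minimal sketch using the InteractionBehaviour component from the Leap Motion Unity Modules' Interaction Engine; the zoom helper is our own illustrative assumption rather than project code:

```csharp
using UnityEngine;
using Leap.Unity.Interaction;  // Leap Motion's Interaction Engine module

// Minimal sketch: make a garment graspable by tracked hands. Component and
// event names follow the Leap Motion Unity Modules; Zoom() is hypothetical.
public class GrabbableClothing : MonoBehaviour
{
    void Start()
    {
        // InteractionBehaviour makes the object graspable by tracked hands
        // (it requires a Rigidbody, which Unity adds automatically here).
        var interaction = gameObject.AddComponent<InteractionBehaviour>();
        interaction.OnGraspBegin += () => Debug.Log("Grabbed " + name);
    }

    // Hypothetical zoom: scale the garment up or down within sane bounds.
    public void Zoom(float delta) =>
        transform.localScale *= Mathf.Clamp(1f + delta, 0.5f, 2f);
}
```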
Next goals to further this project:
Implement machine learning and computer vision to further tailor the 3D human model with parameters such as body shape, skin color, and height
Scan the user's face and project the image onto the 3D human model
Improve the mapping of the clothing onto the 3D human model
An interactive Augmented Reality (AR) gaming experience built at GSK Hackathon, Summer 2019.
An arcade-like booth where folks can play an AR game to pass the time while they wait for the dentist.
Lasts for 2 minutes (86.9% of dentists had wait times of 20 minutes or less).
A fun, interactive experience for consumers to learn how to prevent enamel erosion.
ROI: Revenue from new consumers.
Helps Sales Reps promote Sensodyne by tracking Oral Jam usage rates in each zip code and measuring consumers' oral health knowledge.
Projects I did for my Computer Graphics class (CS428/CS523) @ Rutgers University. The entire focus of the course was Computer Animation, Crowd Simulation, Game AI, Pedestrian Environments & Interactive Storytelling.
Roll A Ball
A 2-player game that involves collecting coins while avoiding collisions with the other ball and with the walls; if either player collides, the game is over.
A simple crowd simulator where you can select a set of agents using the mouse's left click and make them navigate to a desired location using the right click.
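Here is a minimal sketch of that select-and-command loop, assuming each agent carries a Unity NavMeshAgent; box-drag selection is simplified to single-click selection:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.AI;

// Minimal sketch of the crowd simulator's input loop: left click adds an
// agent to the selection, right click sends the selection to the hit point.
public class CrowdCommander : MonoBehaviour
{
    readonly List<NavMeshAgent> selected = new List<NavMeshAgent>();

    void Update()
    {
        if (Input.GetMouseButtonDown(0))                    // left click: select an agent
        {
            Ray ray = Camera.main.ScreenPointToRay(Input.mousePosition);
            if (Physics.Raycast(ray, out RaycastHit hit) &&
                hit.collider.TryGetComponent(out NavMeshAgent agent) &&
                !selected.Contains(agent))
                selected.Add(agent);
        }
        if (Input.GetMouseButtonDown(1))                    // right click: send the selection
        {
            Ray ray = Camera.main.ScreenPointToRay(Input.mousePosition);
            if (Physics.Raycast(ray, out RaycastHit hit))
                foreach (var agent in selected)
                    agent.SetDestination(hit.point);        // navigate to the clicked point
        }
    }
}
```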
Social Forces: Videos are broken down into 4 parts
Goal: A force that moves you along the path towards the goal. Using pathfinding to calculate this force is critical to avoiding local minima.
Proximity: Repels an agent from a static obstacle like a wall.
Repulsion: Repels an agent from other agents to avoid collisions.
Sliding friction: If two agents are deadlocked, this force makes them slide around each other instead of pushing into each other.
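Putting the four forces together, a minimal sketch of the combined force might look like the following; the weights and falloffs are illustrative, and nextWaypoint is assumed to come from the pathfinding step mentioned above:

```csharp
using UnityEngine;

// Minimal sketch combining the four social forces listed above.
// Weights/falloffs are illustrative, not the project's tuned values.
public static class SocialForces
{
    public static Vector3 Combined(Vector3 pos, Vector3 nextWaypoint,
                                   Vector3[] neighbors, Vector3[] obstacles)
    {
        // Goal force: pull toward the next pathfinding waypoint.
        Vector3 force = (nextWaypoint - pos).normalized * 2f;

        foreach (var o in obstacles)                 // proximity: push away from walls
            force += Falloff(pos - o, 1.5f);

        foreach (var n in neighbors)                 // repulsion: keep agents apart
        {
            force += Falloff(pos - n, 1.0f);
            // Sliding friction: a tangential component so deadlocked agents
            // slide around each other instead of pushing head-on.
            Vector3 away = (pos - n).normalized;
            force += Vector3.Cross(Vector3.up, away) * 0.3f;
        }
        return force;
    }

    // Inverse-square style falloff so nearby entities push much harder.
    static Vector3 Falloff(Vector3 offset, float strength) =>
        offset.normalized * (strength / Mathf.Max(offset.sqrMagnitude, 0.01f));
}
```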
Final class project.
The car moves based on mouse-click input: you click a destination and it drives there using the well-known A* shortest-path algorithm. If the user clicks a location where pedestrians are walking, the car first lets the pedestrians cross the street before moving to the clicked location. The car also recognizes detour areas and knows which ones to avoid.
Click-to-move car navigation mostly worked with our custom A* code, but a bug in setting the destination point programmatically unfortunately prevented us from running the simulation properly to collect data.
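For reference, here is a compact, generic grid version of A* (4-connected, Manhattan heuristic); it illustrates the algorithm rather than reproducing our project's road-network code:

```csharp
using System;
using System.Collections.Generic;

// Compact grid A* sketch (requires .NET 6 for PriorityQueue).
static class AStar
{
    public static List<(int x, int y)> FindPath(bool[,] walkable,
                                                (int x, int y) start, (int x, int y) goal)
    {
        int w = walkable.GetLength(0), h = walkable.GetLength(1);
        var gScore = new Dictionary<(int, int), int> { [start] = 0 };
        var cameFrom = new Dictionary<(int, int), (int, int)>();
        var open = new PriorityQueue<(int x, int y), int>();
        open.Enqueue(start, Heuristic(start, goal));

        while (open.Count > 0)
        {
            var cur = open.Dequeue();
            if (cur == goal) return Reconstruct(cameFrom, cur);

            foreach (var (dx, dy) in new[] { (1, 0), (-1, 0), (0, 1), (0, -1) })
            {
                var next = (x: cur.x + dx, y: cur.y + dy);
                if (next.x < 0 || next.y < 0 || next.x >= w || next.y >= h
                    || !walkable[next.x, next.y]) continue;

                int tentative = gScore[cur] + 1;            // uniform step cost
                if (!gScore.TryGetValue(next, out int best) || tentative < best)
                {
                    gScore[next] = tentative;
                    cameFrom[next] = cur;
                    open.Enqueue(next, tentative + Heuristic(next, goal)); // f = g + h
                }
            }
        }
        return null; // no path exists
    }

    // Manhattan distance: admissible on a 4-connected grid.
    static int Heuristic((int x, int y) a, (int x, int y) b) =>
        Math.Abs(a.x - b.x) + Math.Abs(a.y - b.y);

    static List<(int x, int y)> Reconstruct(
        Dictionary<(int, int), (int, int)> cameFrom, (int x, int y) cur)
    {
        var path = new List<(int x, int y)> { cur };
        while (cameFrom.TryGetValue(cur, out var prev)) { cur = prev; path.Add(cur); }
        path.Reverse();
        return path;
    }
}
```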
Assignment for my Virtual Reality Lab class @ Rutgers University. Created a Virtual Drone Exploration game. The goal is to control a drone and collect 6 coins to win. If your drone lands on the ground (fire) or collides with the walls, it explodes.
Assignment for my Virtual Reality Lab class @ Rutgers University. Created a Halo-like game where the player plays as a first-person character. The goal is to shoot down targets with a gun and score 30 or more points to win the game.
Research-oriented course (CS443) taken @ Rutgers University, taught by Dr. Mubbasir Kapadia. The course involves innovating new research projects using 3D game engines and other cutting-edge technologies such as AI/ML, computer vision & more. In addition, weekly lectures feature talks from presenters who are experts in research areas such as crowd simulation environments, computer animation, virtual worlds & more. The goal of this course is to learn about the latest research in the field of computer graphics, listen to astounding researchers present their work, & have students build projects with potential publication value at research conferences such as IEEE, SIGGRAPH & more.
Impairment Aware Human-Building Design
Applications of encouraging human-aware designs to inform architectural decisions extend beyond finding alternative floor plans. This research can also be used to assess the accessibility of buildings and critical situations, including evacuation. Even when architects do incorporate human-building interactions within their floor plans, it is difficult to assess the appropriateness of a floor plan for inclusive populations, including agents who are handicapped. Wayfinding is especially important and a key problem area, proving the need to make data-driven decisions for human-aware design. Additional problem areas include accessibility for different disabilities, including wheelchair use, visual impairment, etc. Within each situation, there are different assessments of building floor plans. An important example is assessing emergency evacuations of buildings, which can be analyzed and optimized through simulations of movement during evacuations to find and correct possible bottleneck areas in a structure [8]. As specified through relevant research, it is of utmost importance to consider these critical agents when making design decisions, which is only possible through quantitative and qualitative analysis of human-building interactions.
NASA SUITS (Spacesuit User Interface Technologies for Students) challenges students to design and create spacesuit information displays within augmented reality (AR) environments. As NASA pursues Artemis - landing American astronauts on the Moon by 2024, the agency will accelerate investing in surface architecture and technology development. For exploration, it is essential that crewmembers on spacewalks are equipped with the appropriate human-autonomy enabling technologies necessary for the elevated demands of lunar surface exploration and extreme terrestrial access. The SUITS 2021 Challenges target key aspects of the Artemis mission.
As NASA pursues future Artemis missions that will land American astronauts on the Moon by 2024, the agency is continuing to invest in the human-autonomy enabling technologies necessary for the elevated demands of lunar surface exploration and extreme terrestrial environments. Astronauts need more assistance from modern technologies as we continue to extend human presence deeper into space. The next-generation lunar spacesuit, the Exploration Extravehicular Activity Mobility Unit (xEMU), defines requirements for a visual display system.
In future missions, a dynamic visual display system optimizes the effectiveness of astronaut surface operations. Displays like augmented reality (AR) are a tool to enable interfacing with lunar payloads, support science work, visualize consumables, streamline crew-to-crew communication, bolster Mission Control Center (MCC) interaction methods, and navigate terrain. The development of visual informatics display systems enables NASA to improve and innovate optics solutions within the emerging AR field, making future EVA missions more efficient and effective.
Core Components include:
Navigation: The AR device must accurately guide the user in real time and help navigate between the multiple EVA assets and a designated geology excavation site. Students use data from their HMD internal sensors (local device coordinates), standard navigation algorithms, appropriate imagery (for recognition/camera vision needs), and any other external data/information available to help users navigate in a lunar environment. (A minimal bearing-calculation sketch follows this list.)
EVA System State:
The AR device must interact with a suit port to execute airlock activities (i.e. UIA, DCU, Intra-vehicular spacesuit prep).
The AR device must have the ability to interact with the designated suit telemetry stream and display vitals in an unobtrusive manner.
Geology Sampling: The AR device must aid in science sampling, science camera/imagery needs, and geology field notes at a designated lunar-simulation geology excavation site.
Auxiliary Considerations: Lighting Conditions -- The UI must remain usable in areas of high contrast between bright and shadowed regions, as is present at the lunar south pole.
Control Methods -- A control method (either a unique implementation or one standard to the chosen AR device) that allows the user to efficiently and effectively execute EVA tasks.
System Tutorial -- The UI must include a short training or welcome video that instructs the user on how to use the features and navigate through the UI design. With this year's fully remote test scenarios, there is a critical need for new users and subject-matter experts to quickly learn each team's UI environment.
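As referenced under Navigation above, here is a minimal sketch of the kind of bearing/distance math a display could derive from local device coordinates; the coordinate frame and waypoint source are our assumptions, not part of the SUITS spec:

```csharp
using System;

// Minimal sketch: given the HMD's local position/heading and a waypoint
// (e.g., an EVA asset or the excavation site), compute the distance and
// relative bearing to show on the display. Frame conventions are assumed.
static class NavCue
{
    // Positions in a local, flat surface frame (meters); heading in degrees.
    public static (double distance, double relativeBearing) To(
        double x, double z, double headingDeg, double wx, double wz)
    {
        double dx = wx - x, dz = wz - z;
        double distance = Math.Sqrt(dx * dx + dz * dz);
        double bearingDeg = Math.Atan2(dx, dz) * 180.0 / Math.PI;  // 0 deg = +z axis
        // Normalize to [-180, 180): positive means the waypoint is to the right.
        double relative = ((bearingDeg - headingDeg) % 360 + 540) % 360 - 180;
        return (distance, relative);
    }
}
```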
enAbleGames is a spin-off startup from Drexel University's RePlay Lab that creates customizable Active Video Games (AVGs) to promote health and activity in people with disabilities.
In addition, enAbleGames is a game design and platform development company composed of physical therapists, biomedical engineers, and computer scientists, providing affordable, effective gaming solutions for use in clinic- and home-based health and rehabilitation services.
My responsibilities involved developing games for patients with neurological conditions such as cerebral palsy, multiple sclerosis, Parkinson's disease, and seizure disorders.
Technologies I worked with were Unity3D, Microsoft Kinect for motion tracking, and enAbleGames' own custom asset libraries, under the supervision of Dr. Paul Diefenbach, Associate Professor @ Drexel University, Associate Program Director of Game Art & Production (Digital Media), and director of the RePlay Lab.
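As a rough illustration of the kind of Kinect-driven input these games rely on, here is a minimal sketch using the Kinect for Windows v2 Unity plugin to drive a game object with the player's right hand; therapy-specific calibration and the enAbleGames asset layer are omitted:

```csharp
using UnityEngine;
using Windows.Kinect;  // Kinect for Windows v2 Unity plugin

// Minimal sketch: read the first tracked body's right-hand position each
// frame and move this object with it (camera-space meters).
public class HandDriver : MonoBehaviour
{
    KinectSensor sensor;
    BodyFrameReader reader;
    Body[] bodies;

    void Start()
    {
        sensor = KinectSensor.GetDefault();
        reader = sensor.BodyFrameSource.OpenReader();
        bodies = new Body[sensor.BodyFrameSource.BodyCount];
        if (!sensor.IsOpen) sensor.Open();
    }

    void Update()
    {
        using (var frame = reader.AcquireLatestFrame())
        {
            if (frame == null) return;                    // no new skeleton data yet
            frame.GetAndRefreshBodyData(bodies);
        }
        foreach (var body in bodies)
        {
            if (body == null || !body.IsTracked) continue;
            var hand = body.Joints[JointType.HandRight].Position;
            transform.position = new Vector3(hand.X, hand.Y, hand.Z);
            break;                                        // use the first tracked body
        }
    }

    void OnDestroy()
    {
        reader?.Dispose();
        if (sensor != null && sensor.IsOpen) sensor.Close();
    }
}
```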