QUIZLY-APP

QUIZLY is a multimodal Android quiz application that lets users select quiz answers with hand gestures. It combines Figma for UI/UX design, React Native for front-end development, MongoDB for database management, Express and Node.js for the backend, Python for the machine learning models, and APIs for model integration. Convolutional Neural Networks (CNNs) are used to train the gesture recognition models.
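
The training code is not included in this README; the block below is only a minimal sketch of how a CNN gesture classifier could be trained with Keras. The 64x64 grayscale frame size, the four gesture classes mapped to options A-D, the layer sizes, the `data/gestures/` directory, and the `gesture_cnn.h5` filename are illustrative assumptions, not the project's actual configuration.

```python
# Minimal training sketch (assumptions: 64x64 grayscale frames, 4 gesture
# classes mapped to quiz options A-D, dataset laid out as data/gestures/<class>/*.png).
import tensorflow as tf
from tensorflow.keras import layers, models

IMG_SIZE = (64, 64)
NUM_CLASSES = 4  # one gesture per quiz option (A, B, C, D)

# Load labelled gesture images from a directory tree (hypothetical path).
train_ds = tf.keras.utils.image_dataset_from_directory(
    "data/gestures/",
    image_size=IMG_SIZE,
    color_mode="grayscale",
    batch_size=32,
)

# A small CNN: two convolution/pooling stages followed by a dense classifier.
model = models.Sequential([
    layers.Rescaling(1.0 / 255, input_shape=(*IMG_SIZE, 1)),
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(128, activation="relu"),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(train_ds, epochs=10)
model.save("gesture_cnn.h5")
```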

Objectives

The objectives of the project are as follows:

  1. Develop a user-friendly and visually appealing UI/UX design using Figma.
  2. Implement the front-end of the application using React Native.
  3. Manage the database using MongoDB for efficient data storage and retrieval.
  4. Build the backend using Express and Node.js to handle application logic and API integration.
  5. Utilize Python for training machine learning models for hand gesture recognition.
  6. Integrate the machine learning models with APIs to enable real-time quiz answer selection (see the sketch after this list).
  7. Employ CNNs for model training to achieve high accuracy.
  8. Implement hand gesture recognition to allow users to select quiz answers using hand gestures.
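
The integration in objective 6 could take many forms; the sketch below assumes the trained model is served behind a small Flask endpoint that the React Native client posts camera frames to. The use of Flask, the `/predict` route, the `frame` field name, and the gesture-to-answer mapping are assumptions for illustration, not the project's documented API.

```python
# Hypothetical Flask endpoint the mobile app could call with a camera frame;
# the trained CNN returns the predicted quiz option (A-D).
import io

import numpy as np
from flask import Flask, jsonify, request
from PIL import Image
import tensorflow as tf

app = Flask(__name__)
model = tf.keras.models.load_model("gesture_cnn.h5")  # saved by the training sketch
OPTIONS = ["A", "B", "C", "D"]  # assumed gesture-to-answer mapping

@app.route("/predict", methods=["POST"])
def predict():
    # Expect a single image file upload named "frame".
    frame = request.files["frame"].read()
    image = Image.open(io.BytesIO(frame)).convert("L").resize((64, 64))
    batch = np.expand_dims(np.array(image, dtype="float32"), axis=(0, -1))
    probs = model.predict(batch)[0]
    return jsonify({"answer": OPTIONS[int(np.argmax(probs))],
                    "confidence": float(np.max(probs))})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```

Under these assumptions, the React Native front end would POST a captured frame to `/predict` and highlight the returned option in the quiz UI.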

Images

(Application screenshots.)

Demo Video

Demo Video Link: https://drive.google.com/drive/folders/1KCmJev2TkMfPkxZ5iwlgKbUSJdSevLQR?usp=sharing
