Research repository for TouchPose: Hand Pose Prediction, Depth Estimation, and Touch Classification from Capacitive Images. ACM UIST 2021. Dataset, models, and code.
Updated Dec 14, 2021 · Jupyter Notebook
Real-time AI hand pose detection and gesture recognition app built with ReactJS and react-webcam
Source code for an example of hand tracking using the MediaPipe library
[TCYB2018] Context-Aware Deep Spatio-Temporal Network for Hand Pose Estimation from Depth Images
Source code for an example hand-tracking application (a virtual calculator) built with OpenCV and MediaPipe. A practice project for OpenCV and image-processing skills in computer vision: with this code, you can create a virtual calculator and perform basic arithmetic.
[CVPR 2024] OHTA: One-shot Hand Avatar via Data-driven Implicit Priors
[CVPR 2022] "Mining Multi-View Information: A Strong Self-Supervised Framework for Depth-based 3D Hand Pose and Mesh Estimation"
Fast hand pose estimation on a bare Raspberry Pi 4 at 7 FPS
Control your computer using hand gestures with AI, using Google's MediaPipe and OAK-D Lite camera.
Hand Mesh Recovery models on OakInk-Image dataset
Personal website
A few CoreML models that allow for American Sign Language alphabet detection.
Hand pose estimation with TensorFlow.js and React.js
The Official PyTorch Implementation of "MHEntropy: Multiple Hypotheses Meet Entropy for Pose and Shape Recovery" (ICCV 2023 Paper)
Real-time AI hand pose estimation and positioning app built with ReactJS and react-webcam
TensorFlow.js handpose gesture recognition: this package provides a utility function to create a "landmark" array from the keypoints and keypoints3D arrays of the latest handpose model.
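A minimal sketch of the idea behind such a utility, written in Python rather than JavaScript: pair each entry of a handpose-style model's 2D `keypoints` output with its 3D counterpart in `keypoints3D`, by index, to form one combined landmark record. The function name and field names here are assumptions for illustration, not the package's actual API.

```python
def to_landmarks(keypoints, keypoints3d):
    """Merge 2D keypoints and 3D keypoints into a single landmark list.

    Each keypoint is assumed to be a dict; 2D entries carry image-space
    pixel coordinates, 3D entries carry metric-space coordinates.
    """
    if len(keypoints) != len(keypoints3d):
        raise ValueError("keypoints and keypoints3D must have equal length")
    landmarks = []
    for kp, kp3d in zip(keypoints, keypoints3d):
        landmarks.append({
            "name": kp.get("name", ""),  # e.g. "wrist", "thumb_tip"
            "x": kp["x"], "y": kp["y"],  # image-space (pixels)
            "x3d": kp3d["x"], "y3d": kp3d["y"], "z3d": kp3d["z"],  # metric space
        })
    return landmarks

# Toy example with two keypoints:
kps = [{"name": "wrist", "x": 120.0, "y": 240.0},
       {"name": "thumb_tip", "x": 150.0, "y": 200.0}]
kps3d = [{"x": 0.0, "y": 0.0, "z": 0.0},
         {"x": 0.03, "y": -0.04, "z": 0.01}]
lms = to_landmarks(kps, kps3d)
```

Keeping both coordinate systems in one record lets downstream gesture code index a single array instead of cross-referencing two.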
SHOWMe: Benchmarking Object-agnostic Hand-Object 3D Reconstruction (Dataset, Contains proposed top baseline reconstructions with estimated camera poses)
This repo contains the code used to carry out the experiments for Paper ID 50 in the pre-registration workshop at NeurIPS 2020.
testSpectrogram is an open-source platform for wireless channel simulation, human/hand pose extraction, gesture spectrogram generation, and real-time gesture recognition based on millimeter-wave passive sensing and communication systems.
This started as a project to detect the various sign language alphabets, but since the code is not specific to sign language, the repository can be used for a wide variety of purposes wherever hand pose detection is needed.