Vvim - Keyboardless Vim interactions

This is done via a hardware glove that the user wears. The glove detects the fingers' positions and translates them into key presses. It's currently a work in progress.

The glove prototype, with 4 sensors on two fingers

Subset of data

The streams of data from the 4 sensors (each shown in a different colour) have been zeroed so that they all centre on the moment when the user pressed the 'y' key.

Current Features

  • Glove prototype has been constructed.

  • Glove can detect finger movements of the right forefinger and right middle finger (with space to expand to more fingers if these first two actually work)

    • This corresponds to the following keys, shown with how often those keys show up in the current dataset: h: 628, u: 291, y: 171, m: 171, b: 155, k: 120, j: 21
  • Glove records finger movements via an Arduino script vvim.ino on an Uno, and sends them to serial output.

  • Serial output is read by the Python script glove_logger.py and saved to the file glove.log along with Unix milliseconds since epoch.

  • A keylogger is installed on the developer's machine, and logs key presses to the file keys.log along with Unix milliseconds since epoch.

  • Running cleanup.sh cleans up the data from the keylogger and the serial output and combines them into one file named sorted.log.

  • A Gradient Boosted tree has been trained and saved to model.pkl. Currently it has a test accuracy of 79.7%.

    • This will hopefully be improved as more data is gathered: currently there are only 587 keypresses on which to train 9 categories, or about 65 examples per category, which is not enough.
  • The file eda.py saves plots to plots/ such as:

Graphs

Each colour is a differently positioned sensor. Each line is one stream of data recorded by a sensor. The streams have each been zeroed so that every instance of pressing a certain key is centred.

Keys on the home row

Some keys are easier to spot than others: my fingers move a lot more when pressing a y than when pressing a k, simply because of where those keys are positioned on the keyboard.

More or less data

The data has not been normalised, so there is far more data for common keys like h than for less common keys like j.
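
The zeroing described above amounts to taking a window of sensor readings around each keypress timestamp and shifting its time axis so that the press sits at 0 ms. A minimal sketch of that idea (not the actual eda.py; the log formats and column names below are assumptions):

    # Sketch: centre sensor streams on keypress times.
    # Assumed formats: glove.log -> "unix_ms,s0,s1,s2,s3", keys.log -> "unix_ms,key".
    import pandas as pd

    glove = pd.read_csv("glove.log", names=["ms", "s0", "s1", "s2", "s3"])
    keys = pd.read_csv("keys.log", names=["ms", "key"])

    WINDOW_MS = 500  # half-width of the window around each keypress

    def window_around(press_ms, readings):
        """Readings within WINDOW_MS of a keypress, re-zeroed so the press is at 0 ms."""
        nearby = readings[(readings.ms >= press_ms - WINDOW_MS) &
                          (readings.ms <= press_ms + WINDOW_MS)].copy()
        nearby["ms"] -= press_ms
        return nearby

    # One centred window per recorded 'y' press, for example.
    y_windows = [window_around(ms, glove) for ms in keys.loc[keys.key == "y", "ms"]]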

In Progress

  • Currently there are only about 600 keypresses recorded. Record more examples of typing, and add more sensors to the fingers so that fewer keystrokes need to be typed to gather the same amount of data.

To Do

  • If flex sensors aren't enough to predict exactly when a key is pressed, add force sensors to the fingertips.
  • Use an Arduino Nano instead of an Uno, and host the entire thing on the user's hand
  • Connect the glove to the computer via Bluetooth, instead of a wired connection
  • Current models don't have the option of categorizing a sequence of sensor readings as not pressing any key at all. This should be fixed so the model isn't constantly assuming at least one key is being pressed (see the sketch after this list)
    • This could be done easily with pressure sensors
  • Write some sort of visualiser to live track sensor data, actual key presses, and predicted key presses
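
For the "not pressing any key" item above, one low-tech option is to label windows of sensor readings that fall far from every recorded keypress as an explicit NONE class, and include that class when fitting the gradient boosted tree mentioned under Current Features. A hedged sketch, assuming windowed features have already been extracted (the names X, y, and NONE are illustrative, not part of the project's actual pipeline):

    # Sketch: include an explicit "no key" class when training the classifier.
    from sklearn.ensemble import GradientBoostingClassifier
    from sklearn.model_selection import train_test_split

    def train(X, y):
        """X: (n_windows, n_features) array of flattened sensor windows.
        y: the key pressed in each window, with "NONE" for windows sampled
        far away from any timestamp in keys.log."""
        X_train, X_test, y_train, y_test = train_test_split(
            X, y, test_size=0.2, stratify=y, random_state=0)
        model = GradientBoostingClassifier()
        model.fit(X_train, y_train)
        print("test accuracy:", model.score(X_test, y_test))
        return model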

Keys and which finger tends to press them

Note that this list is likely very specific to the author, as different people type differently. I think I probably use my right ring finger much more than I really should. Also, I type a y with my index finger for words like type or you (where I subsequently have to type another letter with my right hand), but I type it with my middle finger for words like yes, yank, or keyboard.

  • Right Hand
    • Thumb: space
    • Index: j, m, n, b, h, y
    • Middle: k, y, u, i, <, (, [
    • Ring: l, :, BACKSPACE, o, p, >, ), ], 0, _, -, +, =, ,, .
    • Pinky: ;, ENTER, /, ?
  • Left Hand (Incomplete as I've not yet built a glove for the left hand)
    • Pinky:
    • Ring:
    • Middle:
    • Index:
    • Thumb:
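
If this mapping is ever needed programmatically (for example, to group predicted keys by the finger that presses them), it could be written down as a plain dictionary. A sketch based on the right-hand list above (the names FINGER_KEYS and KEY_FINGER are illustrative):

    # Sketch: the right-hand mapping above as a lookup table.
    FINGER_KEYS = {
        "right_thumb": ["SPACE"],
        "right_index": ["j", "m", "n", "b", "h", "y"],
        "right_middle": ["k", "y", "u", "i", "<", "(", "["],
        "right_ring": ["l", ":", "BACKSPACE", "o", "p", ">", ")", "]",
                       "0", "_", "-", "+", "=", ",", "."],
        "right_pinky": [";", "ENTER", "/", "?"],
    }

    # Invert it to look up which finger usually presses a given key; keys typed
    # by more than one finger (like y) keep whichever finger is listed last.
    KEY_FINGER = {key: finger for finger, keys in FINGER_KEYS.items() for key in keys}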

Here's a picture of my keyboard for reference:

How to Start Recording Data

It's probably best to do this all in tmux, since handling multiple terminal windows is a pain otherwise. A keylogger (I use Casey Scarborough's keylogger) is also required.

  1. Install the requirements:
pip3 install -r requirements.txt
  2. Run the command to clear the logfile:
sudo keylogger clear
  3. Start the keylogger:
sudo keylogger ./keys.log
  4. Start recording glove movements (a sketch of what this script does is given at the end of this section):
python3 glove_logger.py
  5. Put the glove on, and start typing things out. I usually do this by opening a text file (like Alice in Wonderland, available on Gutenberg) in vim (vim alice.txt), splitting the window vertically (:vsp), and opening a temporary file to type into (:e tmp). Finally, type :set cursorbind into both frames so that the source text scrolls as you type it. The keystrokes and finger movements will be recorded separately.

  6. Remove the glove.

  7. Stop the keylogger with CTRL-C.

  8. Stop recording the finger movements with CTRL-C.

  9. Now that the data is recorded, clean it up:

./cleanup.sh
  10. Analyse the data with eda.py:
python3 eda.py

The images will be stored in plots/ for your viewing pleasure.
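
For context on what step 4 records: glove_logger.py pairs each line of serial output with Unix milliseconds since epoch, as described under Current Features. A minimal sketch of that idea, not the actual script; the use of pyserial, the /dev/ttyACM0 port, and the 9600 baud rate are assumptions:

    # Sketch: log serial sensor readings alongside Unix-millisecond timestamps.
    import time
    import serial  # pyserial

    with serial.Serial("/dev/ttyACM0", 9600, timeout=1) as port, \
            open("glove.log", "a") as log:
        while True:
            line = port.readline().decode("ascii", errors="replace").strip()
            if not line:
                continue  # read timed out without a full sensor reading
            millis = int(time.time() * 1000)
            log.write(f"{millis},{line}\n")  # assumes comma-separated sensor values
            log.flush()  # keep the file current if recording is interrupted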

License

This work is licensed under GNU GPLv3. See the attached LICENSE. See https://choosealicense.com/licenses/gpl-3.0/# for a non-legalese explanation of the license.
