


Research & Projects.
Robotic Sensor Design
Tactile Sensing Artificial Fingertip for Prosthetics and Humanoids
Developed for human hand prosthetics and humanoid robotic hands, this artificial fingertip offers high-density touch sensitivity in a compact form factor, comparable in size to a typical human fingertip. It was built during my research work in tactile sensing.

In this project, an origami technique is employed to fold a flat, paper-like flexible tactile sensing module into a three-dimensional shape that mimics the form of a human fingertip. Sensor data acquisition is handled by an onboard 32-bit, low-power microcontroller, and the module communicates using the SPI protocol.
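To illustrate how sampled tactile data might be unpacked on the host side, here is a minimal sketch of decoding a raw SPI frame into per-taxel pressure readings. The frame layout (36 taxels, 2 bytes each, big-endian) is an assumption for illustration; the actual firmware defines its own register map and frame format.

```python
# Hypothetical sketch: decoding one raw SPI frame from the fingertip module.
# The frame layout (2 bytes per taxel, big-endian, NUM_TAXELS taxels) is an
# assumption for illustration, not the module's actual protocol.
import struct

NUM_TAXELS = 36  # assumed taxel count for illustration

def decode_frame(raw: bytes) -> list:
    """Unpack a raw SPI frame into per-taxel 16-bit pressure readings."""
    expected = 2 * NUM_TAXELS
    if len(raw) != expected:
        raise ValueError(f"expected {expected} bytes, got {len(raw)}")
    return list(struct.unpack(f">{NUM_TAXELS}H", raw))

# Example: a frame where every taxel reads 0x0102 (decimal 258)
frame = bytes([0x01, 0x02] * NUM_TAXELS)
readings = decode_frame(frame)
```

Keeping the decoding in one place like this makes it easy to adapt when the taxel count or byte order changes between hardware revisions.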
The development of this module presented a number of challenges, particularly in mechanical design and electronics. Prior to this project, I had limited experience with mechanical Computer-Aided Design (CAD). I had to learn how to design both mechanical structures and electronic schematics, as well as manage the manufacturing and integration processes.
Despite these challenges, a functional prototype of the tactile fingertip was developed. While the current version is not yet optimized for production, it is capable of capturing tactile data. Ongoing work includes further optimization of the design, data collection, and analysis.
The dimensions of the fingertip module closely match those of a typical human fingertip, enabling applications in humanoid robots and prosthetic systems.





Figure 6: The high-density tactile fingertip module mounted on a human finger. This demonstration is for illustrative purposes only: the module is designed for integration into a humanoid robot hand, but I no longer had access to the robot hand by the end of the project. I plan to update this demonstration with a robotic hand implementation in the near future.
The following materials are not yet published:
- Detailed experimental results
- Data analysis and sensor performance evaluations
I haven't released the data, schematics, or full documentation yet, as the project isn't complete enough to share publicly. Once it reaches a point I consider satisfactory, I will make these materials available.
Firmware Library
Thermal and Proximity Sensing in Humanoid Robot Hand
Firmware Library for Acquiring Thermal and Proximity Sensor Data from a Robotic Hand Designed by a London-Based Firm
Sensor Design
Tactile Sensing Module with a Dense Arrangement of Sensors
This tactile sensing module was developed to evaluate the application of closely arranged barometer sensors in tactile sensing.
This module features a high-density arrangement of 36 MEMS barometric pressure sensors in a compact 50 x 65 mm form factor, with potential applications in robotics and other fields. The goal is to characterize the achievable sampling frequency, potential hysteresis issues, and other influencing factors. A separate controller module (GitHub) has been developed to interface with the module and sample data.
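As a back-of-the-envelope illustration of the sampling-frequency question, the sketch below maps the 36 sensors onto a grid and estimates the aggregate frame rate from a per-sensor read time. Both the 6 x 6 arrangement and the 0.5 ms read time are assumptions for illustration, not measured values from this module.

```python
# Illustrative sketch (assumed numbers): mapping the 36 barometers onto a
# 6 x 6 grid and estimating aggregate frame rate for sequential reads.
SENSORS = 36
GRID_COLS = 6  # assumed 6 x 6 arrangement for illustration

def sensor_position(index: int) -> tuple:
    """Map a flat sensor index (0..35) to (row, col) on the grid."""
    return divmod(index, GRID_COLS)

def frame_rate_hz(per_sensor_read_s: float, sensors: int = SENSORS) -> float:
    """Frames per second if all sensors are read sequentially on one bus."""
    return 1.0 / (per_sensor_read_s * sensors)

pos = sensor_position(7)      # sensor 7 sits at (row 1, col 1)
fps = frame_rate_hz(0.5e-3)   # 0.5 ms per read over 36 sensors
```

Reading sensors in parallel across multiple buses would multiply this rate accordingly, which is one of the trade-offs such an evaluation can quantify.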
At the moment, only part of the tactile sensing hardware is populated and tested, as I’ve been occupied with other projects. I plan to resume work and publish the complete version along with test results when I have more time.
Figure 1: The partially populated tactile sensing module with MEMS barometric pressure sensors.


The schematics, evaluation results, and additional details will be shared once testing and evaluation are complete. I’m documenting the progress so far, but since this is a personal project, I need to prioritize other commitments for the time being.
Possible Applications:
Possible applications of this tactile sensing module include robotics, touchpads, gaming interfaces, and other fields that can benefit from tactile sensing.
Embedded Controller
Tactile Controller
A 32-bit embedded controller module developed to efficiently retrieve and process tactile sensor data.
- Microcontroller: PIC32MM0256GPM036 (UQFN-40)
- Protocols Supported:
- I²C
- Hardware SPI (HW SPI)
- Software SPI (SW SPI)
- Application:
- Interfaces with high-density tactile sensor arrays
- Suitable for applications requiring customizable sampling and interfacing logic

https://github.com/neoviki/tactile.controller.module
Computer Vision and Robotic Navigation
Vacuum Cleaner Robot: Object Detection and Navigation
One of my projects involved a vacuum cleaner robot developed by a German consumer appliances firm, where I focused on computer vision and navigation for a differential wheel drive robot.
This is a proprietary project, and I do not have permission to share the source code or other related details.
AI - Computer Vision
Real-Time Traffic Monitoring Using Darknet (YOLO)
This project was developed for a client who required real-time counting of vehicles crossing a traffic signal. It uses the YOLO (You Only Look Once) object detection algorithm, implemented through the Darknet framework; in the prototype version, the AI logic runs on an NVIDIA Jetson Nano.
The system processes video streams to detect and classify objects such as cars, pedestrians, and traffic signs, using a pretrained YOLO model based on the Microsoft COCO dataset. YOLO applies a convolutional neural network that performs detection in a single pass over the image, making it fast and accurate enough for real-time use in smart city applications, traffic flow analysis, and vehicle counting. I used the pretrained model as-is for the counting task, without further training or fine-tuning, though techniques such as transfer learning could be applied to adapt it to specific traffic scenarios.
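The client-specific counting logic is proprietary, but the general technique of counting objects that cross a virtual line can be sketched generically. The example below is an illustration of that pattern, not the delivered implementation: given per-frame centroids for each tracked object, it counts transitions from above a horizontal counting line to on or below it.

```python
# Generic illustration (not the client's proprietary logic): count tracked
# objects whose centroid crosses a horizontal counting line between frames.
COUNT_LINE_Y = 300  # assumed pixel row of the virtual counting line

def count_crossings(tracks):
    """tracks maps a track id to its per-frame (x, y) centroids.
    A crossing is a transition from above the line to on/below it."""
    total = 0
    for centroids in tracks.values():
        for (x0, y0), (x1, y1) in zip(centroids, centroids[1:]):
            if y0 < COUNT_LINE_Y <= y1:
                total += 1
    return total

tracks = {
    1: [(100, 280), (102, 295), (105, 310)],  # crosses the line once
    2: [(200, 320), (198, 330)],              # stays below: no crossing
}
crossed = count_crossings(tracks)
```

In practice the centroids would come from associating YOLO detections across frames with a tracker, and the one-directional test prevents double counting when an object lingers near the line.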

The vehicle counting functionality is proprietary and was delivered only to the client; the remaining implementation is public on GitHub, configured and tested on Ubuntu 16.04 using a ThinkPad T420. Other potential applications of this pipeline include:
- Parking lot occupancy detection
- Pedestrian flow monitoring
- CCTV-based object recognition for security
- Traffic anomaly detection (e.g., wrong-way driving)
- Real-time lane usage and congestion tracking
Darknet by Joseph Redmon : https://pjreddie.com/darknet/
Microsoft COCO Dataset : http://cocodataset.org/#home
AI - Computer Vision
Handwritten Digit Recognition App Using CNN
A simple and intuitive application that recognizes handwritten digits using a Convolutional Neural Network (CNN). The model is trained on the MNIST dataset and deployed with a user-friendly GUI built with Tkinter.
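To illustrate the convolution operation at the heart of a CNN, here is a minimal 2D "valid" convolution in pure Python. This is only a conceptual sketch of the building block; the app's actual model is trained on MNIST with a deep-learning framework.

```python
# Minimal 2D "valid" convolution (cross-correlation) in pure Python,
# illustrating the core CNN operation. Conceptual sketch only; the app's
# real MNIST model uses a deep-learning framework.
def conv2d_valid(image, kernel):
    """Slide kernel over image (lists of lists) with 'valid' padding."""
    ih, iw = len(image), len(image[0])
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for r in range(ih - kh + 1):
        row = []
        for c in range(iw - kw + 1):
            acc = 0.0
            for i in range(kh):
                for j in range(kw):
                    acc += image[r + i][c + j] * kernel[i][j]
            row.append(acc)
        out.append(row)
    return out

# A tiny image with a vertical edge, convolved with a horizontal-difference
# kernel: the edge column produces a strong response.
image = [[0, 0, 1],
         [0, 0, 1],
         [0, 0, 1]]
kernel = [[1, -1],
          [1, -1]]
edges = conv2d_valid(image, kernel)
```

A CNN learns many such kernels automatically during training, stacking them with nonlinearities and pooling so that early layers respond to edges like this while deeper layers respond to digit-scale shapes.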

To get started with the Handwritten Digit Recognition App, follow these steps:
- Clone the repository
- Navigate to the project directory
- Install the required dependencies
- Run the application
git clone https://github.com/neoviki/ai.vision.handwritten.digit.recognizer.cnn.mnist.git
cd ai.vision.handwritten.digit.recognizer.cnn.mnist
chmod +x install.dependencies.sh
./install.dependencies.sh
chmod +x run.app.sh
./run.app.sh
@article{lecun2010mnist,
title={MNIST handwritten digit database},
author={LeCun, Yann and Cortes, Corinna and Burges, CJ},
journal={ATT Labs [Online]. Available: http://yann.lecun.com/exdb/mnist},
volume={2},
year={2010}
}
For more details on this project, visit https://github.com/neoviki/ai.vision.handwritten.digit.recognizer.cnn.mnist
Computer Networks - IoT
Client-Server Architecture for IoT Data Transmission over HTTP/HTTPS
A client-server architecture for transmitting data packets over HTTP/HTTPS, designed for IoT applications. This setup is used for homegrown plant monitoring and vehicle status tracking, offering a cost-effective alternative to MQTT for lightweight, low-traffic communication needs. It’s ideal for simple IoT ecosystems where frequent or intensive data exchange isn't necessary.
For more details on this project, visit https://github.com/neoviki/iot.data.exchange.over.https
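The client-server pattern can be sketched with nothing but the Python standard library: a tiny HTTP server that accepts JSON sensor readings via POST, and a client that submits one. The endpoint path and JSON field names below are assumptions for illustration, not the repository's actual protocol.

```python
# Hedged sketch of HTTP-based IoT data exchange using only the standard
# library. The endpoint path and JSON fields are illustrative assumptions,
# not the repository's actual protocol.
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import Request, urlopen

received = []  # readings collected by the server

class ReadingHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        body = self.rfile.read(int(self.headers["Content-Length"]))
        received.append(json.loads(body))
        self.send_response(200)
        self.end_headers()

    def log_message(self, *args):  # silence per-request logging
        pass

# Bind to an ephemeral port and serve in a background thread.
server = HTTPServer(("127.0.0.1", 0), ReadingHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# Client side: POST one sensor reading as JSON.
reading = {"device": "plant-monitor-01", "metric": "soil_moisture", "value": 41.5}
req = Request(
    f"http://127.0.0.1:{server.server_port}/reading",
    data=json.dumps(reading).encode(),
    headers={"Content-Type": "application/json"},
)
status = urlopen(req).status
server.shutdown()
```

Because readings arrive as ordinary HTTP requests, the same server works behind standard TLS termination for HTTPS, which is part of what makes this approach a simple alternative to MQTT for low-traffic deployments.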
Other Projects
List of Other Projects I Have Worked On
I've worked on a variety of projects based on my interests and the challenges I encountered along the way. They span AI, computer networking, Linux, GUI and terminal tools, as well as robotics and electronics, and are documented on GitHub. This section lists those projects along with their GitHub URLs.
These projects were practical solutions to challenges I encountered in both day-to-day computing and industry environments, across fields including telecom, vehicle telematics, IoT, computer vision, and robotic sensor design. They include Linux utilities and drivers that streamline personal and industrial workflows, AI-driven applications, command-line tools using `awk`, `sed`, and regular expressions for data splitting, filtering, manipulation, and interpretation, C/Python libraries and packages, and custom hardware modules such as tactile sensors for robotic sensing.
Below is a list of selected projects, all of which I've published on GitHub.
Nr | Project Details | Reference |
---|---|---|
1 | ai.vision.realtime.traffic.monitoring | Link |
2 | flexible.tactile.sensing.module.dual.sensors | Link |
3 | tactile.controller.module | Link |
4 | ai.vision.handwritten.digit.recognizer.cnn.mnist | Link |
5 | ai.vision.image.classifier.cnn.cifar10 | Link |
6 | ai.ml.classification.knn | Link |
7 | ai.vision.face.detector | Link |
8 | linux.embed.dir.to.bash.script | Link |
9 | linux.over.the.air.update.utility | Link |
10 | linux.remote.debugger.utility | Link |
11 | linux.rpi.samba.file.share | Link |
12 | linux.rpi.vnc.server | Link |
13 | linux.rpi.wifi.access.point | Link |
14 | linux.wifi.monitor | Link |
15 | compress | Link |
16 | video.to.gif | Link |
17 | versionsnap | Link |
18 | graphical.interface.grid.eye.sensor.electron | Link |
19 | graphical.interface.gyroscope.sensor | Link |
20 | graphical.interface.robotic.gripper | Link |
21 | graphical.interface.serial.monitor | Link |
22 | iot.data.exchange.over.https | Link |
23 | eagle.library.pic32mm0256gpm036.uqfn.40pin | Link |
24 | eagle.library.pic32mm0256gpm028.uqfn.28pin | Link |
25 | arduino.AMG8833.grid.eye.sensor | Link |
26 | arduino.Bosch.BMP384.pressure.sensor | Link |
27 | arduino.VL6180X.time.of.flight.distance.sensor | Link |
28 | arduino.esc.controller | Link |
29 | audio.joiner | Link |
30 | mp3.multiplier | Link |
31 | automate.ceph.distributed.storage.setup | Link |
32 | automate.transaction.on.ebay | Link |
33 | crop.image | Link |
34 | docker.wrapper.commands | Link |
35 | explainable.ai.evalution.app | Link |
36 | expose.tcp.port | Link |
37 | gps.coordinates.renderer.osm | Link |
38 | ipc.filelock | Link |
39 | ipc.messenger | Link |
40 | ipc.shared.memory | Link |
41 | mac.osx.tabrun | Link |
42 | netcopy | Link |
43 | proxmox.automation.utilities | Link |
44 | rand.file.tagger | Link |
45 | responsive.css.layouts | Link |
46 | shortest.path.finder.sim | Link |
47 | terminate | Link |
48 | translate.textfile | Link |
49 | uscreen | Link |
50 | video.splitter | Link |
51 | website.backup.and.restore.utility | Link |
52 | html.ui.generator | Link |
53 | ui.python.tkinter.wrapper | Link |
About Me.

I am Viki (officially VN, representing the initials of my first and last name). By profession, I am a software architect specializing in embedded systems and IoT, with over a decade and a half of experience in the software industry, including nearly a decade at a leading California-based computer networking company famously associated with a San Francisco bridge (a hint, if you're curious about the company). In addition, I have worked as a consulting software architect and manager for more than five years, supporting clients across Europe and the UK, and have collaborated with and led teams of engineers from diverse ethnic and cultural backgrounds, including professionals from France, Belgium, the UK, Russia, and South Asia. I have built solutions including robotic sensors, artificial intelligence applications, embedded firmware, and Linux-based software for industries such as robotics, computer networking and telecommunications, IoT, and automotive engineering.
My academic background includes a master’s degree from Germany, specializing in Computer Science and Robotics, which provided me with a fundamental understanding of robotics, sensors, and the building blocks of artificial intelligence. During my studies, I had the opportunity to work with robots, from humanoids to autonomous vacuum cleaner robots, as well as in artificial intelligence areas such as computer vision and explainable AI.
One of my most significant achievements is the development of a high-density tactile fingertip module for prosthetic hands and humanoid robots, which is detailed on my research and projects page. In my home lab, I have created over 50 open-source projects, ranging from embedded hardware and AI applications to over-the-air update utilities, Linux utilities, and sensor modules, all of which are documented on my GitHub (see my project list page).
I have a strong interest in computing, electronics, and robotics, and I enjoy researching their interdisciplinary applications, exploring how they can improve existing processes in a target domain while optimizing both efficiency and cost.
My technical skills include programming in C and Python, and I have developed applications in various other languages, adapting to each based on project requirements. In addition to programming, I actively work on data analysis and organization using the awk programming language, as well as concepts like regular expressions and relational databases, which I frequently apply for debugging, data parsing, and structuring information.
Alongside my software and data analysis work, I have practical expertise in using the KiCAD EDA tool and CAD modeling with Solid Edge, particularly for embedded systems and robotic sensor module design.
Contact Me.
You can get in touch with me at hello@viki.design