About

I am Lethabo (Scofield), an AI/ML Engineer specializing in neural networks and machine learning systems. My work focuses on understanding how models perceive and represent information, reason across inputs, and adapt to complex environments, including reinforcement learning (RL) techniques such as Proximal Policy Optimization (PPO).

My experiments are guided by state-of-the-art research. In representation learning, I draw on the Transformer architecture for attention-based models. In vision-language alignment, I study ALBEF (Align Before Fuse) and CLIP (Contrastive Language-Image Pre-training). Each experiment integrates these insights to test model reasoning and semantic understanding across modalities.

I run Olyxee, a project dedicated to deploying neural networks on heterogeneous edge devices. The approach takes inspiration from research on model quantization and optimization for resource-constrained environments, such as Deep Compression.
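To make the quantization idea concrete, here is a minimal sketch of symmetric 8-bit post-training quantization in NumPy. This is an illustrative toy, not the Olyxee pipeline or the full Deep Compression method (which also involves pruning and Huffman coding); all names here are hypothetical.

```python
import numpy as np

def quantize_int8(weights):
    # Symmetric 8-bit quantization: one scale maps floats to the int8 range.
    scale = np.abs(weights).max() / 127.0
    q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    # Recover approximate float weights for inference or error analysis.
    return q.astype(np.float32) * scale

rng = np.random.default_rng(1)
w = rng.normal(scale=0.1, size=(64, 64)).astype(np.float32)  # toy weight matrix
q, scale = quantize_int8(w)
w_hat = dequantize(q, scale)
# Per-weight reconstruction error is bounded by half a quantization step.
max_err = np.abs(w - w_hat).max()
```

Storing `q` instead of `w` cuts weight memory by 4x (int8 vs float32), which is the basic trade-off edge deployment exploits.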

This site documents experiments, research notes, and small projects. Each post links directly to the research foundations behind the work, showing how theoretical insights are applied in practical neural network design and multimodal reasoning.

Focus Areas

Current Work

My ongoing experiments focus on multimodal representation learning. The goal is to connect images and language in a shared mathematical space. I generate embeddings for both modalities using approaches inspired by CLIP and ALBEF. These embeddings are stored as vectors in an index for fast retrieval and analysis, allowing me to test semantic alignment and reasoning across modalities.
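The retrieval step above can be sketched in a few lines of NumPy. The random arrays below are placeholders standing in for real CLIP-style encoder outputs (actual encoders would map raw images and captions into this shared space); the dimensions and names are assumptions for illustration.

```python
import numpy as np

def normalize(v):
    # Project embeddings onto the unit sphere so dot product = cosine similarity.
    return v / np.linalg.norm(v, axis=-1, keepdims=True)

rng = np.random.default_rng(0)
# Placeholder embeddings: 5 "images" and one "caption" in a shared 512-dim space.
image_embeddings = normalize(rng.normal(size=(5, 512)))
text_query = normalize(rng.normal(size=(512,)))

# Retrieval: rank images by cosine similarity to the text query.
scores = image_embeddings @ text_query
best = int(np.argmax(scores))  # index of the best-aligned image
```

In practice the brute-force matrix product would be replaced by an approximate nearest-neighbor index once the collection grows, but the alignment test is the same: cross-modal similarity in one shared space.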

Contact

For collaboration, research discussions, or project inquiries, reach out through the Connect section on the homepage. I respond to questions related to experiments, neural system design, and deployment frameworks.