Neural Networks

CarePal

In a whirlwind of technology and innovation, I recently had the privilege of participating in a hackathon at Georgian College, a gathering that proved to be both a challenge and an exhilarating opportunity to push the boundaries of artificial intelligence (AI) in solving real-world problems. Presented with four themes (information security, healthcare, smart cities, and sustainability), my team and I set out to find a meaningful application of AI that could make a tangible difference in people's lives. After several hours of brainstorming and discussion, we were drawn to healthcare, a field where AI's potential to improve lives is both vast and deeply personal.

The spark for our project came from a simple yet profound observation: a team member shared how assisting a senior neighbor with technology brought her immense joy, and it highlighted a critical need, because many seniors live alone, often without the assistance they require. With over 42% of Canadian seniors living alone, we saw a clear opportunity to make a difference. Thus, CarePal was born.

CarePal is not just another piece of technology; it's a proactive AI companion designed to perform wellness checks, ensure medication adherence, provide company, detect behavioral trends, and alert caregivers to emergencies or anomalies. What sets CarePal apart is its accessibility: it connects across various devices to accommodate seniors with audio, visual, or speech impairments. Leveraging the API of Cohere, a Canadian enterprise specializing in generative AI solutions, we equipped CarePal with a large language model enhanced by retrieval-augmented generation. This foundation allows CarePal to offer not just interaction but truly insightful and helpful engagement, tailored to the unique needs of seniors.

Developing CarePal was a marathon of innovation, requiring around 20 hours of dedicated work. Our team was a blend of talents, divided into three key roles:

Hackers: the tech wizards who brought the first prototype of CarePal to life.
Business Development: where I contributed, diving into business research, branding, and development to ensure CarePal's market readiness and impact.
The Hustler: the charismatic force who pitched our product, presenting CarePal's potential to transform senior care.

Our journey culminated in the hackathon's finals, where CarePal was awarded second place, a moment of immense pride and validation for our hard work. But beyond the accolades, the experience was a profound reminder of the power of technology to make a difference in the lives of those who need it most.

As we move forward, our experience at the Georgian College hackathon remains a beacon of what's possible when innovation meets empathy. CarePal is just the beginning. The journey of using technology to enhance human lives is endless, and I am eager to continue on this path, wherever it may lead. For a closer look at our pitch and the story of CarePal, check out our pitch video here.


Understanding Neural Networks using Math and Numpy

In this blog post, I'll take you through my project journey, where I set out to build a neural network from scratch using only NumPy. This endeavor was not just a programming exercise but a deep dive into the underpinnings of machine learning models. Whether you're a seasoned coder or new to the tech world, I hope my experiences and insights can shed some light on the fascinating world of neural networks.

The Genesis

The first step was to import the necessary Python libraries: NumPy for numerical computations, Pandas for data manipulation, and Matplotlib for visualizing the data. These tools are staples of the data science toolkit, providing a robust foundation for handling and analyzing complex datasets.

My dataset resided in a CSV file containing pixel values of handwritten digits along with their corresponding labels, a perfect dataset for a classification task. Using Pandas, I loaded the data and took a peek at the first few rows with df.head(). Each row represented an image of a handwritten digit, with the first column holding the label (the digit) and the following 784 columns (28×28 pixels) holding the pixel values.

Preprocessing

The raw data needed to be converted into a format suitable for the neural network. I transformed the DataFrame into a NumPy array to leverage NumPy's powerful numerical operations. Recognizing the potential for overfitting, I partitioned the data into training and cross-validation sets. This split would later help in tuning the model's parameters to improve its generalization.

The Neural Network

The core of my project was the implementation of a simple yet effective neural network. The network consisted of two layers, a hidden layer and an output layer, each with its own weights and biases. These parameters were initialized randomly, adhering to the principle that starting points matter in the journey of optimization.

Activation Functions: Bringing Non-linearity

To introduce non-linearity, I used the Rectified Linear Unit (ReLU) function for the hidden layer and the softmax function for the output layer. These choices are common in neural network architectures because of their computational efficiency and effectiveness in training.

Forward Propagation: A Leap of Faith

The forward propagation step computed the linear transformations and activations for both layers. This step was crucial for generating predictions from the input data.

Backward Propagation: Learning from Mistakes

Learning in neural networks occurs through backward propagation. By comparing the predictions with the actual labels, I computed gradients for the weights and biases. These gradients directed how to adjust the parameters to reduce the error in the predictions.

Iterative Optimization

The training process was iterative, employing gradient descent to update the parameters in small steps. Each iteration brought the model closer to its goal: minimizing the loss function and improving its accuracy on the training data.

Results

After numerous iterations, the model's performance on unseen data (the cross-validation set) was promising. This success was a testament to the power of simple neural network architectures when armed with the right techniques and a systematic approach.

This project was a profound learning experience, demystifying the workings of neural networks and reinforcing the importance of foundational principles in machine learning. It's a journey that has just begun, with endless possibilities and challenges ahead. The sketches below illustrate what each of these steps can look like in code.
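To make the loading step concrete, here is a minimal sketch, not the exact code from my repository. The file name train.csv is a placeholder; the only assumption is the layout described above, with the label in the first column and 784 pixel columns after it.

```python
import pandas as pd

# Load the CSV of handwritten digits; "train.csv" is a placeholder file name.
df = pd.read_csv("train.csv")

# First column: label (the digit 0-9); remaining 784 columns: pixel values.
print(df.head())
```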
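The preprocessing can then look roughly like this. The dev-set size of 1,000 rows and the scaling of pixels to [0, 1] are illustrative choices, not necessarily the exact values used in the project.

```python
import numpy as np

# Convert the DataFrame to a NumPy array and shuffle rows before splitting,
# so the cross-validation split is not biased by the file's row order.
data = df.to_numpy()
np.random.shuffle(data)

# Hold out the first 1000 rows as a cross-validation (dev) set.
data_dev = data[:1000].T
Y_dev = data_dev[0].astype(int)
X_dev = data_dev[1:] / 255.0        # scale pixel values to [0, 1]

# The rest becomes the training set; columns are examples after transposing.
data_train = data[1000:].T
Y_train = data_train[0].astype(int)
X_train = data_train[1:] / 255.0
```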
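Parameter initialization and the two activation functions might be sketched as follows. The hidden-layer width of 10 units and the small random initialization scale are assumptions for illustration.

```python
import numpy as np

def init_params(hidden=10, classes=10, pixels=784):
    # Random starting weights and zero biases for the hidden and output layers.
    W1 = np.random.randn(hidden, pixels) * 0.01
    b1 = np.zeros((hidden, 1))
    W2 = np.random.randn(classes, hidden) * 0.01
    b2 = np.zeros((classes, 1))
    return W1, b1, W2, b2

def relu(Z):
    # Element-wise max(0, z): the non-linearity for the hidden layer.
    return np.maximum(0, Z)

def softmax(Z):
    # Subtract the column max for numerical stability, then normalize
    # the exponentials so each column sums to 1 (class probabilities).
    e = np.exp(Z - Z.max(axis=0, keepdims=True))
    return e / e.sum(axis=0, keepdims=True)
```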
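Forward and backward propagation, building on the helpers above, could look like this. The gradients assume a softmax output with a cross-entropy loss, which is the standard pairing for this kind of classifier.

```python
import numpy as np

def forward(W1, b1, W2, b2, X):
    # Linear transform + ReLU for the hidden layer, then linear + softmax.
    Z1 = W1 @ X + b1
    A1 = relu(Z1)
    Z2 = W2 @ A1 + b2
    A2 = softmax(Z2)
    return Z1, A1, Z2, A2

def one_hot(Y, classes=10):
    # Turn integer labels into one-hot column vectors.
    out = np.zeros((classes, Y.size))
    out[Y, np.arange(Y.size)] = 1
    return out

def backward(Z1, A1, A2, W2, X, Y):
    # Gradients of the cross-entropy loss with respect to weights and biases.
    m = Y.size
    dZ2 = A2 - one_hot(Y)                    # softmax + cross-entropy gradient
    dW2 = dZ2 @ A1.T / m
    db2 = dZ2.sum(axis=1, keepdims=True) / m
    dZ1 = (W2.T @ dZ2) * (Z1 > 0)            # ReLU derivative
    dW1 = dZ1 @ X.T / m
    db1 = dZ1.sum(axis=1, keepdims=True) / m
    return dW1, db1, dW2, db2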
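Finally, a sketch of the iterative optimization loop and the check on the cross-validation set. The learning rate of 0.1 and 500 iterations are illustrative hyperparameters, not the project's exact settings.

```python
import numpy as np

def update(W1, b1, W2, b2, grads, lr):
    dW1, db1, dW2, db2 = grads
    # Step each parameter a small amount against its gradient.
    return W1 - lr * dW1, b1 - lr * db1, W2 - lr * dW2, b2 - lr * db2

def accuracy(A2, Y):
    # Predicted class is the row with the highest softmax probability.
    return np.mean(np.argmax(A2, axis=0) == Y)

def gradient_descent(X, Y, iterations=500, lr=0.1):
    W1, b1, W2, b2 = init_params()
    for i in range(iterations):
        Z1, A1, Z2, A2 = forward(W1, b1, W2, b2, X)
        grads = backward(Z1, A1, A2, W2, X, Y)
        W1, b1, W2, b2 = update(W1, b1, W2, b2, grads, lr)
        if i % 50 == 0:
            print(f"iteration {i}: train accuracy {accuracy(A2, Y):.3f}")
    return W1, b1, W2, b2

# Train on the training split, then check generalization on the dev split.
W1, b1, W2, b2 = gradient_descent(X_train, Y_train)
_, _, _, A2_dev = forward(W1, b1, W2, b2, X_dev)
print("dev accuracy:", accuracy(A2_dev, Y_dev))
```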
I hope this account of my project has illuminated some aspects of neural networks and inspired you to embark on your own projects. The blend of theory and practice in machine learning is a powerful tool for solving complex problems, and it's within reach for anyone willing to learn and explore. Check out more projects on my GitHub profile: https://github.com/TirtheshJani/NN-with-math-and-numpy
