Hashtags have been a staple of the social media experience for almost 15 years. They’ve introduced us to new communities, built brand recognition for thousands of organizations, and made finding and organizing content on social media dramatically easier. But what if what you’re searching for isn’t popular enough to have its own hashtag? What if you don’t know which hashtag to search for?

At UC Berkeley’s Sutardja Center for Entrepreneurship and Technology, six students are attempting to close this gap in search. Using an approach known as Unsupervised Multimodal Matching, they combine natural language processing with unsupervised learning to extract features from both text and images, allowing users to simply upload an image or type some text and immediately receive relevant posts.
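To make the idea concrete, here is a minimal sketch of cross-modal matching built on a pretrained CLIP-style encoder via the open-source sentence-transformers library. It illustrates the general technique only; the model choice, sample captions, and file name are assumptions for illustration, not details of the team’s system.

```python
# Minimal cross-modal matching sketch: one encoder maps images and text
# into a shared embedding space, so an image can be matched against posts
# without any hashtags. Model and sample data are illustrative assumptions.
from PIL import Image
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("clip-ViT-B-32")

# Hypothetical corpus of post captions to search over.
posts = [
    "Golden hour over the Golden Gate Bridge",
    "Homemade ramen with a soft-boiled egg",
    "Sunset surf session at Ocean Beach",
]
post_embeddings = model.encode(posts, convert_to_tensor=True)

# Query with an image instead of a keyword or hashtag.
query_embedding = model.encode(Image.open("query.jpg"), convert_to_tensor=True)

# Cosine similarity ranks posts by semantic relevance to the image.
scores = util.cos_sim(query_embedding, post_embeddings)[0]
best = int(scores.argmax())
print(f"Most relevant post: {posts[best]!r} (score={scores[best].item():.3f})")
```

Because both modalities land in the same vector space, one index can serve text-to-image, image-to-text, and image-to-image queries without hand-curated tags.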

“The current state of search functionality in many social websites leaves a lot to be desired,” says Kyle Atkinson, a senior at UC Berkeley. “You’re incredibly limited in how you search, and when you’re not 100% sure what you’re looking for, your search can take much longer than anticipated, or possibly never get you to what you’re looking for. That’s why we started looking into how we can transform the back end of how search works.”

The group aims to create a B2B product that can be adopted by search forums, social media sites, and eventually even retailers. A large retailer could then take a picture of an item and immediately query an indexed database for its price, shelf location, and more, as in the sketch below.
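As a hedged sketch of that retail scenario, the same shared-embedding idea can index product photos alongside store metadata; the catalog entries, image files, and model here are hypothetical placeholders rather than a description of the team’s product.

```python
# Hypothetical retail lookup: embed catalog photos once, then match a
# shopper's snapshot against the index to recover price and shelf location.
from PIL import Image
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("clip-ViT-B-32")

# Assumed catalog: each product pairs an image with store metadata.
catalog = [
    {"name": "Oat milk, 1L", "price": 4.99, "shelf": "A3", "image": "oat_milk.jpg"},
    {"name": "Cold brew, 12oz", "price": 3.49, "shelf": "B1", "image": "cold_brew.jpg"},
]
index = model.encode([Image.open(p["image"]) for p in catalog], convert_to_tensor=True)

# The shopper's photo becomes the query against the embedding index.
query = model.encode(Image.open("snapshot.jpg"), convert_to_tensor=True)
hit = util.semantic_search(query, index, top_k=1)[0][0]
product = catalog[hit["corpus_id"]]
print(f"{product['name']}: ${product['price']} on shelf {product['shelf']}")
```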

Project By: Coco Sun, Diru Jia, Tianxiao Gaoqu, Jiang Qu, Kyle Atkinson, Wenqi Kou

Industry Mentor: Elaine Pang