Aziz is CTO at Tarjimly. He co-founded Tarjimly with Atif Javed in 2017 as a response to the Syrian refugee crisis and the US Refugees and Travel Ban. Tarjimly’s mission is to improve the lives of refugees and the efficiency of humanitarian services by eliminating language barriers.
The team set out to enable any bilingual person around the world to become a volunteer interpreter for 10 or 20 minutes. To achieve this reach of volunteerism and efficiency of humanitarian impact, they rely on democratizing access to data.
Tell us about Tarjimly; what inspired you to start the company?
I started the company with Atif Javed in 2017. We were both MIT engineering students, roommates and best friends. We moved to the Bay Area in 2015, and we'd always had this conversation about how we could multiply the effect of our engineering skills into a product that really delivers a significant outcome for the world.
Around this time, in January of 2017, the Executive Order on Refugees and Travel Ban was signed. So immediately, Atif’s and my Facebook feeds were flooded with messages like “Hey, I’m a lawyer. I’m now helping all these people that are stuck in airports, but I need a translator to help me communicate with them.”
At the same time, we were working with an NGO in Greece trying to coordinate a volunteer trip, completely separately from what was happening in the US. We were trying to go to Greece and volunteer to help in a refugee camp. And one very surprising thing I remember was a call with that NGO, where they told me that if you speak Arabic, the best thing you can do for us is come and translate. So I immediately asked them, “Why do I have to go there? I can help you right now.” And they said, “That’s awesome. Let me add you to this WhatsApp group.”
So, we were thinking that this should be a solvable problem! And that’s when I started writing the first prototype of Tarjimly using Facebook Messenger. In a couple of weeks we put it online, and within 2 days of launching the site, 600 volunteers had signed up as volunteer translators and were using Facebook Messenger to help. This was incredible, and it gave us a really strong signal about people’s willingness to do good if they are provided with the opportunity.
This quickly led to our mission to improve the lives of refugees and immigrants that are facing a lack of support because of language barriers. We set out to completely break down these language barriers in the humanitarian sector by enabling any bilingual around the world to become a volunteer interpreter for 10 or 20 mins.
In 2019 our goal is to onboard 1 million volunteers to help 100,000 refugees.
Where are you now with Tarjimly and what is next?
In 2018, we helped 15,000 refugees via 8,000 volunteer translators who speak over 92 languages, in interactions averaging 20 minutes.
Looking into 2019, we just released our iOS and Android apps (in addition to Facebook Messenger), and our goal is to onboard 1 million volunteers to help 100,000 refugees. We also plan to build a premium experience for NGOs.
How are you using data across the company?
Data for us is part of the bloodline of how we operate. Fundamentally, when we started building our product, we had metrics embedded at the very first layer. So we now have a very good understanding of our user journeys, because we collect data points and analyze them to really focus on how to improve the user experience.
An example of how our core operation depends on data: an aid worker meets someone in need, and if they don’t speak their language, they need a translator. And one thing that’s clear is that they need this translator in real time. So they push the translator request to us, and then we have to figure out who, anywhere in the world, is most likely to be available in less than 2 minutes.
And how do we do that? We do that by analyzing the user’s previous interactions with the application, as well as information about where they are in the world or what timezone they’re in. We then bring all these data points together, and we have a model that predicts which volunteers are most likely to accept a request at any given time. So, we have this very sophisticated, data-heavy machinery in the middle that tries to figure out how to best match a volunteer to the best opportunity out there, while keeping in mind the volunteer’s time and what the specific need is.
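To make that idea concrete, here is a minimal, hypothetical sketch of the kind of volunteer-ranking logic Aziz describes. The features, weights, and data structures below are illustrative assumptions, not Tarjimly’s actual model.

```python
# Hypothetical sketch: rank volunteers by how likely they are to accept a
# translation request right now. Feature choices and weights are assumptions
# for illustration only.
from dataclasses import dataclass
from datetime import datetime, timezone, timedelta


@dataclass
class Volunteer:
    name: str
    languages: set                # languages the volunteer can translate
    tz_offset_hours: int          # volunteer's timezone offset from UTC
    past_accept_rate: float       # fraction of previous requests they accepted
    hours_since_last_seen: float  # how recently they were active in the app


def availability_score(v: Volunteer, language: str) -> float:
    """Estimate how likely a volunteer is to accept a request right now."""
    if language not in v.languages:
        return 0.0
    # Local hour: volunteers are more likely to respond during waking hours.
    local_hour = (datetime.now(timezone.utc) + timedelta(hours=v.tz_offset_hours)).hour
    awake = 1.0 if 8 <= local_hour <= 22 else 0.2
    # Recency: someone active in the last day is more likely to see the ping.
    recency = 1.0 / (1.0 + v.hours_since_last_seen / 24.0)
    return v.past_accept_rate * awake * recency


def rank_volunteers(volunteers, language, top_n=5):
    """Return the volunteers most likely to accept the request, best first."""
    scored = [(availability_score(v, language), v) for v in volunteers]
    scored = [(s, v) for s, v in scored if s > 0]
    return [v for s, v in sorted(scored, key=lambda p: p[0], reverse=True)[:top_n]]
```

In practice a learned model would replace the hand-tuned weights here, but the shape of the problem is the same: score every eligible volunteer on availability signals, then ping the top candidates.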
Data for us is part of the bloodline of how we operate. We now have a very good understanding of our user journeys […] to really focus on how to improve the user experience.
Tell us some of the key operational metrics you are using.
A second dimension of how Tarjimly is using and analyzing data is around operations. We have this product out in the world and we have people and NGOs using it, so how do we figure out internally what the most valuable thing to do today is?
For example, if I’m launching a marketing campaign - and that’s something we’re doing today, we’re launching a marketing campaign to onboard new volunteers - we need to know which specific languages we should focus on. These types of operational questions are 100% data-driven.
Are you democratizing access to data within Tarjimly, especially since you are using data to drive operational decisions in different functions?
Starting from the engineering side, we have to make sure that we are collecting data safely, storing it securely, and that we have all the information we might need when critical questions come up.
But actually, the most relevant part is how humans consume data, and, you know, humans don’t consume data in databases. That’s not how I can make an operational decision. That’s not how anyone on my team can decide which feature to prioritize or which language to launch a marketing campaign for.
So very early in 2018, or even late 2017, we started to think of a really good way to visualize all this data and create consumable products that I can look at to make decisions. One of the first things we created was a Chartio dashboard that analyzes marketing campaign efficiency. Every morning when I wake up, I log into Chartio and see how we did yesterday and whether we were successful or not.
Another huge thing that I check almost daily is a Chartio dashboard that tells me the effectiveness of the matching within our Tarjimly model. As I mentioned before, we have a machine learning model that figures out who the best translator is for a particular need. And we have a team that’s actually building the model and improving it. As a leader in this organization, how do I evaluate how good our model is today compared to 3 weeks ago, and how do I know what impact the latest and greatest feature had on the model? The best way to do that is via a dashboard that allows me to visualize the outcome and success of my machine learning model.
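As a rough illustration of what such a dashboard might roll up, here is a small, hypothetical sketch that computes weekly acceptance rate and median time-to-match from matching sessions. The session fields and week keys are assumptions for illustration, not Tarjimly’s actual schema.

```python
# Hypothetical sketch: weekly matching-effectiveness metrics of the kind a
# dashboard could chart to compare the model today against a few weeks ago.
from collections import defaultdict
from statistics import median


def weekly_matching_metrics(sessions):
    """sessions: iterable of dicts like
    {"week": "2019-W06", "accepted": True, "seconds_to_match": 85}"""
    by_week = defaultdict(list)
    for s in sessions:
        by_week[s["week"]].append(s)

    report = {}
    for week, rows in sorted(by_week.items()):
        accepted = [r for r in rows if r["accepted"]]
        report[week] = {
            "requests": len(rows),
            "acceptance_rate": len(accepted) / len(rows),
            "median_seconds_to_match": (
                median(r["seconds_to_match"] for r in accepted) if accepted else None
            ),
        }
    return report
```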
At Tarjimly we rely entirely on Chartio to be this human interface to our complex data infrastructure. And all the different teams basically look at their own charts to know if they are doing a good job or not.
We rely entirely on Chartio to be this human interface to our complex data infrastructure. And all the different teams basically look at their own charts to know if they are doing a good job or not.
In your opinion, is there a “right” or “wrong” way to use data in modern business? What are some pitfalls?
I think this question has multiple dimensions. There’s one big dimension when you talk about the pitfalls of data when it comes to violating the trust of your partner - and that partner could be a user or could be an organization. Today, there’s a lot of discussion in the public around how we can effectively use data to make decisions and improve service without violating the trust of any one partner. When you’re responsible for data within an organization, you have to always ask that question, and you have to have a really good answer to it. And fundamentally, this is largely about the privacy and security of your users.
Now, internally within an organization, one of the most common pitfalls I have seen in my experience is around lack of context. People will treat data as the truth, and rightfully so. But I think a lot of people miss the fact that when you record a specific piece of data, you’re not recording the full context of it. And what can happen here is that you could easily make completely wrong decisions by misinterpreting your own data, unless you have a very good understanding of the context around how your data was created. And this problem becomes harder the larger an organization is. So, you need some data documentation, and every one of your data points needs to have a story.
What is your view on the future of data analytics tools?
I think one very common thing we’ve all seen and heard is how data can be part of this movement of data-driven decision making. We all enjoy the idea that our decisions are backed by evidence and that this evidence is data. I think the fundamental future of how data shapes organizations is the way data can become part of the transparency picture within organizations. So beyond using data as a tool for us to make data-driven decisions, how can the fact that we have first-class data infrastructure allow everyone to be infinitely more transparent when it comes to decisions, in a way that’s very safe and secure? Allowing that data to speak for the transparency of the decision-making process you have within an organization is the type of next-level requirement that I want to see in the future.