Software for new hardware is what we do best, and our engineering team is constantly experimenting with new technologies. Currently, our team is excited to work on VUI (voice) projects and expand our AI/ML practice. We are working very closely with industry AR/VR leaders and undertaking interesting and unique applications of these technologies.
We are fans of open source projects and believe in not reinventing the wheel. We use Laravel, a PHP 7 framework, to power our API layer. It gives PHP super-powers like command-line tooling, dependency injection, an ORM, queues and scheduling, event listeners, and so forth. We also use Vue.js to power our front-end application, alongside other technologies like Redis, MongoDB, MySQL, Jenkins, Express, and Node.js.
1 Open Position
We’ve doubled down on machine learning and data science in order to maintain a massive data catalog (the largest grocery catalog ever), build our customer and shopper apps, identify lost demand in our fulfillment chain, and solve a souped-up version of the classic traveling salesman problem. There is no shortage of interesting models to improve, algorithms to optimize, and problems to solve.
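The "souped-up version of the classic traveling salesman problem" above has the flavor of delivery-route optimization. As a generic illustration only (not Instacart's actual algorithm), here is the classic nearest-neighbor heuristic on made-up coordinates:

```python
import math

def nearest_neighbor_route(stops, start=0):
    """Greedy TSP heuristic: always drive to the closest unvisited stop."""
    unvisited = set(range(len(stops))) - {start}
    route = [start]
    while unvisited:
        last = stops[route[-1]]
        nxt = min(unvisited, key=lambda i: math.dist(last, stops[i]))
        route.append(nxt)
        unvisited.remove(nxt)
    return route

# Hypothetical (x, y) coordinates: a depot followed by four delivery stops.
stops = [(0, 0), (2, 1), (5, 5), (1, 4), (6, 1)]
print(nearest_neighbor_route(stops))  # → [0, 1, 3, 2, 4]
```

Real fulfillment routing adds time windows, shopper capacity, and live traffic, which is what makes the production version so much harder than this sketch.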
Millions of customers buy their groceries on Instacart. Our backend systems support tight integrations with the largest retailers in North America and our engineers are working to scale operations across our iOS, Android, and web applications. We currently use Rails, Ruby, Python, R, PostgreSQL 9.6, React 0.17, AWS, Docker, RabbitMQ, Sidekiq, Snowflake, Stripe, Twilio, Mapbox, and SiftScience, but we don’t require folks to have experience with our stack. If you have a solid grasp of the fundamentals and are eager to learn new languages, please reach out!
37 Open Positions
Our stack includes Ruby on Rails, NodeJS/Serverless, and DynamoDB, just to name a few. We aim to increase developer velocity by using new technologies with large community resources. We realize that building applications on dated technology discourages other people from wanting to work with us and limits the educational resources available to us online. Open source technology has allowed us to get answers to our questions faster, which in turn has helped us build faster. We ship code every day and are constantly rolling out new features to our customers. We decide what needs to be built and then look at which technology will get us there fastest. You may not have experience with the Serverless Framework, but as long as you are open to learning it, we are open to teaching it to you!
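For readers new to the Serverless Framework, a deployed service often reduces to a plain function handler. This minimal Python sketch assumes an API Gateway proxy event; the names and payload are illustrative, not this team's actual code:

```python
import json

def handler(event, context):
    """Minimal AWS Lambda handler for an API Gateway proxy event.

    The "name" query parameter and greeting payload are invented for
    illustration; a real service would route to business logic here.
    """
    name = (event.get("queryStringParameters") or {}).get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"hello, {name}"}),
    }
```

In a Serverless Framework service, a few lines of YAML would map an HTTP route to this function and handle packaging and deployment.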
As a team, we’re very open to adopting new technologies. In fact, we’re constantly looking for the best tools for the job and if we can’t find it, we’ll develop our own (we developed our own ink engine and also built the foundation for collaboration). While we are very deliberate about the technologies we use, we don’t require engineers to have experience with any of them beforehand. We believe that anyone can learn a new language on the job. If anything, one of the most important attributes we look for in candidates is having a growth mindset.
There are some truly remarkable technological advancements in medicine. We can somehow use DNA sequencing and bioinformatics to analyze the function and structure of entire genomes, but we still can’t seem to send you your medical records from a previous doctor without faxing the documents page by page. To solve this problem, we are leveraging machine learning to help patients and their families manage their medical records. In 2017, we won Google Cloud’s Machine Learning Startup Competition. To date, we have collected and structured records from over 25,000 different health care facilities in the US. We have built and deployed the models that allow us to extract, label, and structure handwritten doctors’ notes, labs, and images. Anyone who’s ever tried to read their doctor’s notes can understand just how complicated and amazing this is!
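The record-structuring step can be illustrated with a toy labeler. This is only a sketch: the production system uses trained ML models on handwriting and images, and every keyword and label below is invented for illustration.

```python
# Toy sketch of labeling lines of a free-text medical record by section.
# The real pipeline uses trained models; these keywords are illustrative only.
SECTION_KEYWORDS = {
    "labs": ("hemoglobin", "wbc", "glucose"),
    "medications": ("mg", "daily", "tablet"),
    "notes": ("patient", "reports", "exam"),
}

def label_line(line):
    """Return the first section whose keywords appear in the line."""
    text = line.lower()
    for label, keywords in SECTION_KEYWORDS.items():
        if any(k in text for k in keywords):
            return label
    return "unlabeled"

record = [
    "Patient reports mild headache on exam.",
    "Glucose 98 mg/dL, WBC 6.2",
    "Ibuprofen 200 mg tablet daily",
]
print([label_line(line) for line in record])
```

The hard part the team describes is exactly what this sketch skips: getting clean text out of handwritten, scanned documents in the first place.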
Visual communication and accountability tools for construction, roofing, and more
Contractors need to know that what they are looking at is accurate and up-to-date, so we focus on making sure our back-end performance is solid. This enables us to provide a simple, intuitive user experience. On the web, we use Ruby on Rails with a React front-end. We also use Sass, Webpacker, Postgres, Mongo, Kafka, Amazon S3, AWS Lambda, Terraform, and Docker, to name a few more. Our mobile app is primarily built with React Native alongside some custom native libraries in Java, Objective-C, Swift, and C++ for core functionalities. You'll also find Apollo and GraphQL, along with Redux for our local state management.
1 Open Position
At EyeLevel.ai, we are re-imagining many components of the digital advertising technology stack for a fast-approaching future where the interfaces to our computers are conversational. Advertising through these new interfaces will resemble personalized, contextually relevant brand recommendations. In order to deliver an advertisement through conversation, we must accurately interpret and classify what users are saying through colloquial spoken or written language. Then we have to maintain a sufficiently rich context: not just what was said last but also what has been said in the past, what the user is interested in, what their current emotional state is, etc. We must then match this context to brand recommendations with a high degree of accuracy. Failing to do so is not just ineffective for the brands, but also potentially upsetting for the users and our developer partners.
To solve these challenges, we use some of the most well-known machine learning and natural language processing tools, frameworks, and algorithms (RNNs, TensorFlow, etc.). But that’s not enough. We are also inventing solutions to bridge the gaps in these relatively new, evolving technologies. At EyeLevel.ai, there is an opportunity to make contributions to the machine learning community and to push the ecosystem forward in meaningful ways. The very nature of the market and challenges we are facing as a company demand it.
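As a generic illustration of the intent-classification step (far simpler than the RNN/TensorFlow models mentioned above), here is a bag-of-words cosine-similarity matcher; the intents and utterances are invented:

```python
from collections import Counter
import math

# Toy intent matcher for conversational requests. The real system uses
# trained neural models; these example intents are illustrative only.
INTENTS = {
    "find_coffee": "where can I get a good coffee nearby",
    "order_food": "I want to order dinner delivered to my place",
    "buy_shoes": "looking for new running shoes on sale",
}

def bow(text):
    """Bag-of-words vector: word -> count."""
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[w] * b[w] for w in set(a) & set(b))
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def classify(utterance):
    """Return the intent whose example utterance is most similar."""
    vec = bow(utterance)
    return max(INTENTS, key=lambda name: cosine(vec, bow(INTENTS[name])))

print(classify("any place to order delivered dinner"))
```

A production system would also carry the conversational context described above (history, interests, emotional state) rather than classifying each utterance in isolation.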
One example of how we leverage cutting-edge tech is our use of distributed messaging queues; our team is also vertically integrated.
We build highly available and scalable backend infrastructure with a focus on performance in order to provide the best possible user experience in our products. To achieve this, we leverage the latest and greatest technologies of a modern tech stack, including AWS Lambda, Apollo GraphQL, AWS Kinesis Streams, Prisma ORM, and AWS Aurora Postgres. We have a distributed architecture that uses a mix of point-to-point API requests between different Lambda instances as well as Kinesis Streams as our distributed messaging queue, with resilient retry mechanisms in our integrations.
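The "resilient retry mechanisms" mentioned can be sketched generically. This exponential-backoff helper is an illustration, not production code; in a real Kinesis producer, `fn` would wrap the client's `put_record` call and the retriable errors would include throughput-exceeded exceptions.

```python
import time

def with_retries(fn, max_attempts=5, base_delay=0.01, retriable=(ConnectionError,)):
    """Call fn, retrying retriable errors with exponential backoff.

    All names and defaults here are illustrative, not the production
    configuration described above.
    """
    for attempt in range(max_attempts):
        try:
            return fn()
        except retriable:
            if attempt == max_attempts - 1:
                raise
            time.sleep(base_delay * 2 ** attempt)

# Simulated flaky publish: fails twice with a transient error, then succeeds.
attempts = {"n": 0}
def flaky_publish():
    attempts["n"] += 1
    if attempts["n"] < 3:
        raise ConnectionError("transient stream error")
    return "ok"

print(with_retries(flaky_publish))
```

Backoff like this matters with a shared stream: retrying immediately under throughput pressure only amplifies the congestion that caused the failure.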
Some exciting projects that we’ll be working on in the future are building out infrastructure to become a featured bank for 3rd-party integrations, building a more flexible rewards pipeline, implementing systems that will allow us to offer a full suite of banking-related products, and breaking out our services into a true microservice infrastructure.