Particle is the most widely used Internet of Things (IoT) platform. We’ve built the largest community of developers and engineers in the industry, and we’ve shipped hundreds of thousands of development kits to 170 countries. Hundreds of IoT products, from smart home appliances to industrial equipment, are being built on Particle’s platform. We are committed to making it easy for anyone to get their products online, and we support product creators of any size — from Fortune 500s to blossoming entrepreneurs.
Investment in our people is critical to who we are as a company, and the health and safety of our employees are our number one priority. We have updated our reimbursement policy for certain work-related expenses to ensure employees have the equipment and resources they need to work from home effectively. From extensive onboarding to 360 feedback, you’ll have the tools to navigate your own success from the start. From there, you’ll work with your manager to create your own growth path and get the coaching needed to achieve it. Our people get:
- Competitive health, dental, and vision benefits
- Stock options
- Flexible and open vacation policy
- Work from home allowance
- Generous parental leave policy
About the Role
As Particle’s first data engineer, you will work closely with both business and technical stakeholders and shape the way we store and retrieve data efficiently. As someone passionate about data, you will build and implement a large-scale data platform that helps the engineering, product, sales, marketing, and operations teams make use of the data they collect. As you define best practices for making data-driven decisions across the organization, you will have an immediate impact on the future growth and success of Particle.
- Manage the flow of data across multiple sources and build/use tools that reliably extract and load data from various sources.
- Decide what metrics are worth tracking, work with technical and non-technical teams to instrument data, and provide visibility to people all around the company.
- Validate and invalidate key assumptions through the democratization of data.
- Build coherent, version-controlled data models in Tableau that empower the entire company to make sense of its data and make key business decisions using a data-driven approach.
- Empower operational teams by integrating key actionable data points into load-bearing operational systems, including but not limited to Marketo, Salesforce, and NetSuite.
- At least 10 years of experience building and maintaining big data pipelines and/or analytical or reporting systems at scale.
- Deep and extensive experience with the following technologies:
- SQL and scripting experience with one of the following: Ruby, Node.js, or Python
- MongoDB (preferred) or other non-relational databases
- Stitch or Segment, and dbt
- Snowflake (highly preferred but not required)
- Tableau (preferred) or a similar BI Analytics tool
- Machine Learning skills a plus!
- A solid foundation in data modeling, experience with data transformation, and in-depth knowledge of distributed systems.
- A passion for building and defining a strong foundation of using data-driven approaches to make key business decisions cross-functionally throughout an organization.
We are inspired by our mission of enabling the next technological revolution, and by the incredible products that people build on our platform. We also support our Particle family and community by embracing our core values of caring, curiosity, openness, joy, and trust.
Everyone is welcome at Particle. We are an equal opportunity employer that values diversity and inclusion and looks for applicants who understand, embrace, and thrive in a multicultural world. We do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status.
Pursuant to the SF Fair Chance Ordinance, we will consider employment for qualified applicants with arrests and conviction records.