Seismic

Software Engineer II

Job Locations: IN-Hyderabad
Job ID: 2025-1947
Category: Product and Engineering
Employment Type: Full-Time
Remote: No

About Us

Please be aware we have noticed an increase in hiring scams potentially targeting Seismic candidates. Read our full statement on our Careers page.
 
Seismic is the global leader in AI-powered enablement, empowering go-to-market leaders to drive strategic growth and deliver exceptional customer experiences at scale. The Seismic Enablement Cloud™ is the only unified AI-powered platform that prepares customer-facing teams with the skills, content, tools, and insights needed to maximize every buyer interaction and strengthen client relationships. Trusted by more than 2,000 organizations worldwide, Seismic helps businesses achieve measurable outcomes and accelerate revenue growth. Seismic is headquartered in San Diego with offices across North America, Europe, Asia and Australia. Learn more at seismic.com.
 
Seismic is committed to building an inclusive workplace that ignites growth for our employees and creates a culture of belonging that allows all employees to be seen and valued for who they are. Learn more about DEI at Seismic here.

Overview

At Seismic, we're proud of our engineering culture, where technical excellence and innovation drive everything we do. We're a remote-first data engineering team responsible for the critical data pipeline that powers insights for over 2,300 customers worldwide. Our team manages all data ingestion processes, leveraging technologies like Apache Kafka, Spark, C# microservices, and a shift-left data mesh architecture to transform diverse data streams into the valuable reporting models our customers rely on daily to make data-driven decisions. Additionally, we're evolving our analytics platform to include AI-powered agentic workflows.

 

Who You Are: 

  • Have working knowledge of at least one object-oriented language, preferably C#, but we won't hold your Java expertise against you (you're the type of person who's interested in learning and becoming an expert at new things). Additionally, we've been using Python more and more, and bonus points if you're familiar with Scala. 

  • Have experience with architecturally complex distributed systems. 

  • Highly focused on operational excellence and quality – you have a passion for writing clean, well-tested code and believe in the testing pyramid. 

  • Outstanding verbal and written communication skills, with the ability to work with others at all levels and to collaborate effectively with geographically remote and culturally diverse teams. 

  • You enjoy solving challenging problems, all while having a blast with equally passionate team members. 

  • Conversant in AI engineering. You've been experimenting with building AI solutions/integrations using LLMs, prompts, Copilots, Agentic ReAct workflows, etc. 

What you'll need:

  • BS or MS in Computer Science, similar technical field of study, or equivalent practical experience. 

  • 3+ years of software development experience within a SaaS business. 

  • Familiarity with .NET Core, C#, and related frameworks. 

  • Experience in data engineering: building and managing data pipelines and ETL processes, and familiarity with the technologies that drive them: Kafka, Fivetran (optional), Spark/Scala (optional), etc. 

  • Data warehouse experience with Snowflake or similar (AWS Redshift, Apache Iceberg, ClickHouse, etc.). 

  • Familiarity with RESTful microservice-based APIs. 

  • Experience with modern CI/CD pipelines and infrastructure (Jenkins, GitHub Actions, Terraform, Kubernetes, or equivalent) is a big plus. 

  • Experience with Scrum and the Agile development process. 

  • Familiarity with developing in cloud-based environments. 

  • Optional: Experience with 3rd party integrations 

  • Optional: familiarity with meeting systems like Zoom, Webex, MS Teams. 

  • Optional: familiarity with CRM systems like Salesforce, Microsoft Dynamics 365, HubSpot. 

What you'll be doing:

    • Collaborating with experienced software engineers, data scientists and product managers to rapidly build, test, and deploy code to create innovative solutions and add value to our customers' experience. 

    • Building large scale platform infrastructure and REST APIs serving machine learning driven content recommendations to Seismic products. 

    • Leveraging the power of context in third-party applications such as CRMs to drive machine learning algorithms and models. 

    • Helping build next-gen Agentic tooling for reporting and insights 

    • Processing large amounts of internal and external system data for analytics, caching, modeling and more. 

    • Identifying performance bottlenecks and implementing solutions for them. 

    • Participating in code reviews, system design reviews, agile ceremonies, bug triage and on-call rotations.  


If you are an individual with a disability and would like to request a reasonable accommodation as part of the application or recruiting process, please click here. 

 

Headquartered in San Diego and with employees across the globe, Seismic is the global leader in sales enablement, backed by firms such as Permira, Ameriprise Financial, EDBI, Lightspeed Venture Partners, and T. Rowe Price. Seismic also expanded its team and product portfolio with the strategic acquisitions of SAVO, Percolate, Grapevine6, and Lessonly. Our board of directors is composed of several industry luminaries including John Thompson, former Chairman of the Board for Microsoft.  

 

Seismic is an equal opportunity employer and all qualified applicants will receive consideration for employment without regard to gender, age, race, religion, or any other classification which is protected by applicable law.   

 

Please note this job description is not designed to cover or contain a comprehensive listing of activities, duties or responsibilities that are required of the employee for this job. Duties, responsibilities and activities may change at any time with or without notice. 

