$120K - $160K a year
Design and develop data science solutions to analyze large datasets, build analytics products, communicate insights, and support business decisions.
Expert-level SQL, Python, Apache Spark, AWS, and Databricks skills, with 10+ years of relevant experience and a Bachelor's degree or equivalent.
Job Title: Data Scientist 5 - Data Analytics
Location: Beaverton, OR (100% Remote)
Duration: 8+ Months

Project Description:
• Client’s Marketplace Coverage Correction Factors (MCCF) product is a data science solution designed to estimate total marketplace sales at a detailed product level, particularly in areas where Client does not have direct access to retailer point-of-sale (POS) data.
• The MCCF product leverages advanced modeling to “gross up” known sales data from mapped accounts and predict sales for unmapped accounts, helping Client gain a comprehensive view of marketplace performance.
• This project is essential for supporting business decision-making and optimizing Client’s marketplace strategy.

Job Description:
• Designs, develops, and programs methods, processes, and systems to consolidate and analyze structured and unstructured “big data” from diverse sources to generate actionable insights and solutions for client services and product enhancement.
• Builds “products” for analysis.
• Interacts with product and service teams to identify questions and issues for data analysis and experiments.
• Develops and codes software programs, algorithms, and automated processes to cleanse, integrate, and evaluate large datasets from multiple disparate sources.
• Identifies meaningful insights from large data and metadata sources; interprets and communicates insights and findings from analysis and experiments to product, service, and business managers.
• Leads the accomplishment of key goals across consumer and commercial analytics functions.
• Works with key stakeholders to understand requirements, develop sustainable data solutions, and provide insights and recommendations.
• Documents and communicates systems and analytics changes to the business, translating complex functionality into business-relevant language.
• Validates key performance indicators and builds queries to quantitatively measure business performance.
• Communicates with cross-functional teams to understand the business cause of data anomalies and outliers.
• Develops data governance standards, from data ingestion to product dictionaries and documentation.
• Develops SQL queries and data visualizations to fulfill ad-hoc analysis requests and ongoing reporting needs, leveraging standard query syntax.
• Organizes and transforms information into comprehensible structures.
• Uses data to predict trends and performs statistical analysis.
• Uses data mining to extract information from datasets and identify correlations and patterns.
• Monitors data quality and removes corrupt data.
• Evaluates and utilizes new technologies, tools, and frameworks centered on high-volume data processing.
• Improves existing processes through automation and efficient workflows; builds and delivers scalable data and analytics solutions.
• Works independently and takes initiative to identify, explore, and solve problems; designs and builds innovative data and analytics solutions to support key decisions.
• Supports standard methodologies in reporting and analysis, such as data integrity, unit testing, data quality control, system integration testing, modeling, validation, and documentation.
• Independently supports end-to-end analysis to advise product strategy, data architecture, and reporting decisions.

Requirements:
• Must have expert-level SQL skills
• Python (standard libraries)
• Apache Spark
• AWS
• Databricks

Qualifications:
Typically requires a Bachelor's degree and a minimum of 10 years of directly relevant experience; experience should include comprehensive experience as a business/process leader or industry expert.
Note: One of the following alternatives may be accepted:
- PhD or Law + 8 yrs; Master's + 9 yrs; Associate's degree + 11 yrs; High School + 12 yrs.
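For context on the MCCF “gross up” idea described above, here is a minimal illustrative sketch in Python. It assumes the simplest possible form of a coverage correction factor (the inverse of the estimated share of the marketplace covered by mapped accounts); the actual MCCF product uses advanced modeling, and all names and numbers below are hypothetical.

```python
def gross_up(mapped_sales: float, coverage_share: float) -> float:
    """Estimate total marketplace sales from mapped-account sales.

    coverage_share is the estimated fraction of the marketplace that
    mapped (POS-reporting) accounts represent, in (0, 1]. Its inverse
    acts as a simple coverage correction factor.
    """
    if not 0 < coverage_share <= 1:
        raise ValueError("coverage_share must be in (0, 1]")
    correction_factor = 1.0 / coverage_share
    return mapped_sales * correction_factor


# Hypothetical example: $2.4M of mapped POS sales, with mapped accounts
# estimated to cover 60% of the marketplace.
total_estimate = gross_up(2_400_000, 0.60)  # roughly $4.0M
```

In practice the correction factor would vary by product category, region, and channel, and the unmapped-account predictions would come from a fitted model rather than a single scalar share.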
This job posting was last updated on 10/14/2025