Home

The INFORMS Computing Society (ICS) addresses the interface of O.R. and computing. Since their earliest days, O.R. and computing have been tightly linked. The practice of O.R. depends heavily on the availability of software and systems capable of solving industrial-scale problems: computing is the heart of O.R. in application.

ICS is INFORMS' leading edge for computation and technology. Major ICS interests include algorithms and software for modeling, optimization, and simulation. ICS is also interested in the leading edge of computing and how it affects O.R. (e.g., XML modeling standards, O.R. services offered over the web, open-source software, constraint programming, massively parallel computing, and high-performance computing).
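
To make the modeling-and-solver interface concrete, here is a small illustrative sketch (not an ICS recommendation): it formulates a tiny, invented product-mix linear program in Python using the open-source PuLP modeling library, which is assumed here only because it is freely available and bundles the CBC solver.

```python
# Illustrative only: a tiny product-mix linear program.
# Assumes the open-source PuLP library (pip install pulp), which
# ships with the CBC solver; the data below are made up.
from pulp import LpMaximize, LpProblem, LpVariable, value

model = LpProblem("product_mix", LpMaximize)
x = LpVariable("x", lowBound=0)   # units of product A
y = LpVariable("y", lowBound=0)   # units of product B

model += 40 * x + 30 * y          # objective: total profit
model += x + y <= 100             # assembly hours available
model += 2 * x + y <= 150         # machine hours available

model.solve()                     # calls the bundled CBC solver
print("x =", value(x), "y =", value(y), "profit =", value(model.objective))
```

The same model-then-solve pattern, with the solver doing the heavy lifting, is what scales up to the industrial-size problems described above.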

Jobs of Interest to ICS

  • Chattanooga, Tennessee: Sr Data Engineer (Kenco)

    About the Position: The Sr Data Engineer will develop and support the next-generation architecture for the advanced supply chain analytics offering, covering both Data Engineering and Business Intelligence (BI). The role involves designing, creating, managing, and using large datasets across a variety of data platforms, and crafting, implementing, and operating stable, scalable, low-cost solutions to replicate data from production systems into the BI data store. (A minimal, illustrative ETL sketch appears after these job listings.)

    Functions:
    - Partner with key distribution, transportation, and material handling business managers to design and implement efficient and scalable Extract, Transform, Load (ETL) solutions
    - Assist in developing and improving the current ETL/BI architecture, emphasizing data security, data quality and timeliness, scalability, and extensibility
    - Design low-latency data architectures at scale to enable data-driven decision-making
    - Deliver performant and scalable data products and platforms, both Online Transactional Processing (OLTP) and Online Analytical Processing (OLAP), to consume, process, analyze, and present data from the data ecosystem, while driving data engineering best practices
    - Ensure the integrity and uninterrupted processing of batch and real-time data pipelines that ingest and process data from various internal data sources and third-party platforms
    - Work closely with the Data Science and Business Intelligence teams to develop and maintain company-level Key Performance Indicators (KPIs) and accompanying core metrics, and develop architecture to drive business growth

    Qualifications:
    - Bachelor's degree in Computer Science; Master's or PhD preferred
    - 8-10 years of Data Engineering experience, with a proven track record of working with large datasets and working closely with data analysts, software engineers, and infrastructure engineers
    - Expert-level skills in writing and optimizing complex SQL and in schema buildout with structured and unstructured data, including star, constellation, and snowflake schemas
    - Proficiency with languages such as Python, Ruby, or Java
    - Strong background in Big Data technologies such as Spark, Kafka, and Hadoop
    - Experience developing and delivering technical roadmaps and architectures for platforms or cross-functional problems that impact multiple teams
    - Experience creating data warehouses, data lakes, and data marts
    - Experience in data mining, profiling, and analysis
    - Experience with workflow orchestration tools and rules-driven process automation engines
    - Good understanding of Continuous Integration (CI) / Continuous Delivery (CD) principles
    - Demonstrated track record of dealing with ambiguity, prioritizing needs, and delivering results in a dynamic business environment
    - Proven ability to develop unconventional solutions; sees opportunities to innovate and leads the way
    - Experience designing data solutions using AWS cloud data platforms and tools
    - Preferred: experience creating scalable architecture that enables development of data-driven products, creating or supporting decision-support dashboards, and mentoring BI or ETL teams

    Competencies: Business Acumen (knowledgeable in current and possible future policies, practices, trends, technology, and information affecting the business and organization); Communicate for Impact (proactively communicates with all stakeholders throughout the life cycle of programs and projects); Influencing Others (quickly finds common ground and solves problems for the good of the organization with a minimal amount of noise; authentically gains the trust and support of peers); Managing Transitions / Change Management (effectively plans, manages, and communicates process changes with appropriate stakeholders); Strategic Agility (enables Kenco to remain competitive by adjusting and adapting to the innovative ideas necessary to support Kenco's long-term organizational strategy).

    Travel Requirements: This position is expected to travel approximately 25% or less. A passport is not required but is recommended.
  • Owings Mills, Maryland: Senior Data Analyst, Workforce Planning and Management (T. Rowe Price)

    Role Summary: Reporting to the Sr. Manager of Workforce Planning and Management (WPM), the WPM Senior Data Analyst will play a vital role in enhancing the analytical functions within the WPM organization. The position's primary focus will be to build and enhance WPM solutions by applying skills in statistics, machine learning, database management, and visualization. Creating and maintaining these solutions will improve WPM's ability to manage capacity and resources for the business units it supports. The individual will leverage technical and analytical experience to support the team and draw on experience working within teams to improve the overall speed and effectiveness of WPM's operating model.

    Responsibilities:
    - Analytics, Modeling & Forecasting: Unlock new insights and develop new solutions by applying statistical analyses and machine learning techniques; improve WPM's current forecasting capability and develop tools that improve staff-level recommendations; perform ad hoc analyses to deliver insights to WPM stakeholders and internal WPM analysts as opportunities occur. (A small, illustrative forecasting sketch appears after these job listings.)
    - Visualizations & Business Unit Communication: Develop new visualizations that increase transparency and provide new business insights pertaining to capacity management; build new dashboards and KPIs that report on historical, current, and forecasted performance, including business metrics and model forecasts; work with WPM peers to communicate business line performance to WPM's business partners.
    - Data Management & Architecture: Centralize and streamline the flow of data from the various systems and databases WPM uses to support the business; improve data storage capabilities within WPM to provide the foundation for its analytical, forecasting, and reporting functions; help transition WPM's data from Microsoft-based tools to enterprise-supported databases and warehouses.

    Commitment to Diversity, Equity, and Inclusion: We strive for equity, equality, and opportunity for all associates. When we embrace the power of diversity and create an environment where people can bring their authentic and best selves to work, our firm is stronger, and we create greater value for our clients. Our commitment and inclusive programming aim to lift the experience for each associate and build allies for our global associate community. We know that a sense of belonging is key not only to your success at the firm, but also to your ability to bring your best each day.

    Benefits: We invest in our people through a wide range of programs and benefits, including competitive pay and bonuses, a generous retirement plan and employee stock purchase plan with matching contributions, flexible and remote work opportunities, health care benefits (medical, dental, vision), tuition assistance, and wellness programs (fitness reimbursement, Employee Assistance Program). Our policies may change as our working lives evolve, yet our commitment to supporting our associates' well-being and addressing the needs of our clients, business, and communities is unwavering.

    T. Rowe Price is an equal opportunity employer and values diversity of thought, gender, and race. We believe our continued success depends upon the equal treatment of all associates and applicants for employment without discrimination on the basis of race, religion, creed, colour, national origin, sex, gender, age, mental or physical disability, marital status, sexual orientation, gender identity or expression, citizenship status, military or veteran status, pregnancy, or any other classification protected by country, federal, state, or local law.

    Required Qualifications:
    - Bachelor's degree or the equivalent combination of education and relevant experience
    - 5+ years of relevant work experience in Data Science, Operations Management, Operations Research, Statistics, Computer Science, Information Systems, or related fields
    - Knowledge of advanced statistical concepts to provide new insights to internal and external stakeholders
    - Demonstrated proficiency creating and using advanced machine learning algorithms and statistical methods such as regression, simulation, scenario analysis, modeling, clustering, decision trees, and neural networks
    - Proficiency with data, particularly querying data with SQL and managing data in spreadsheets or databases
    - Experience using object-oriented programming languages such as Python to develop models, perform analytical tasks, and automate processes
    - Strong analytical foundation, sound judgment, and the ability to solve problems independently
    - Demonstrated ability to leverage technology to implement efficient, robust, cross-functional solutions
    - Demonstrated ability to troubleshoot and resolve data-related technical issues
    - Experience with visualization software such as Tableau, Spotfire, or BI tools to communicate business performance and status
    - Strong written and verbal communication skills
    - Excellent time management skills and the ability to prioritize multiple tasks
    - Proven ability to persuade peers and inspire associates, clients, and vendors to comply with current policies and procedures

    Preferred Qualifications:
    - Experience using tools, such as Atlassian products, that foster collaboration, development, and version control for programming solutions
    - Experience with Amazon Web Services (AWS) and its ecosystem of solutions such as EC2, RDS, and Redshift
    - Experience working with unstructured data using NoSQL solutions
    - Experience working with data warehouses, particularly Snowflake
    - Knowledge of how to use tools such as Hadoop and Spark to implement big data solutions
    - Experience leveraging data from third-party providers such as Google Analytics
    - Demonstrated leadership skills and experience
    - Excellent customer service, interpersonal, and communication skills
    - Proactive in seeking out process improvements to reduce expense or improve efficiency
  • Dallas, Texas: Senior Associate, Artificial Intelligence (Multiple Positions), PricewaterhouseCoopers Advisory Services LLC

    Create automation pipelines to help clients decrease routine labor inputs and reduce manual hours. Assist clients in process improvement, transformation, the effective use of technology and data & analytics, and leveraging alternative delivery as key areas to drive value. Determine and advise clients on the best model for their automation needs after an investigation into their business practices and current needs. Help clients identify and prioritize emerging technologies to get the most out of their investments. Apply knowledge of local, national, and international technology, business, and economic issues, especially as they pertain to the use of AI or data science solutions. 40 hrs/week, Mon-Fri, 8:30 a.m. - 5:30 p.m.

    Minimum Requirements: Bachelor's degree or foreign equivalent in Business Administration, Mathematics, Management Information Systems, Computer Science, or a related field, plus 3 years of related work experience; in the alternative, the employer will accept a Master's degree or foreign equivalent in one of those fields, plus 1 year of related work experience. Must have at least one year of experience with each of the following:
    - Building machine learning models, data pipelines, and autonomous systems, interpreting their output, and communicating the results to a non-technical audience;
    - Performing DevOps/engineering tasks in publishing and deploying AI assets in live production environments suitable for large-scale adoption; and
    - Creating a standard error logging protocol for existing apps on workbench.

    Travel up to 80% is required. Please apply by mail, referencing Job Code TX3376, Attn: HR SSC/Talent Management, 4040 West Boy Scout Boulevard, Tampa, FL 33607.
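
The Chattanooga listing above revolves around ETL pipelines that replicate production data into a BI store. The following is a minimal, purely illustrative sketch of that pattern, not the employer's actual stack: the "production" source is mocked as a list of records, the BI store is an in-memory SQLite database, and every table and field name is invented.

```python
# Illustrative ETL sketch using only the Python standard library.
# The "production" source is mocked as a list of dicts; the BI store
# is an in-memory SQLite database. All names and fields are invented.
import sqlite3

def extract():
    # Stand-in for reading from a production system or API.
    return [
        {"order_id": 1, "warehouse": "CHA-01", "units": 120, "unit_cost": 2.50},
        {"order_id": 2, "warehouse": "CHA-02", "units": 80,  "unit_cost": 3.10},
    ]

def transform(rows):
    # Derive a total-cost field and normalize the warehouse code.
    return [
        (r["order_id"], r["warehouse"].lower(), r["units"], r["units"] * r["unit_cost"])
        for r in rows
    ]

def load(records, conn):
    conn.execute(
        "CREATE TABLE IF NOT EXISTS bi_orders "
        "(order_id INTEGER PRIMARY KEY, warehouse TEXT, units INTEGER, total_cost REAL)"
    )
    conn.executemany("INSERT OR REPLACE INTO bi_orders VALUES (?, ?, ?, ?)", records)
    conn.commit()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    load(transform(extract()), conn)
    print(conn.execute("SELECT * FROM bi_orders").fetchall())
```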
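
Likewise, the Owings Mills listing centers on forecasting to support staffing decisions. As a small illustrative sketch on invented data, the snippet below fits an ordinary least-squares autoregression on two lagged values of a weekly volume series and projects one period ahead; NumPy is an assumed dependency, and nothing here reflects the team's actual methods.

```python
# Illustrative workforce-demand forecast: an AR(2)-style regression
# fit by ordinary least squares. The weekly volumes are invented and
# NumPy is an assumed dependency, not something from the job posting.
import numpy as np

volumes = np.array([510, 530, 525, 560, 580, 575, 600, 615, 630, 640], dtype=float)

# Design matrix of [1, y_{t-1}, y_{t-2}] rows and the target y_t.
X = np.column_stack([np.ones(len(volumes) - 2), volumes[1:-1], volumes[:-2]])
y = volumes[2:]

coef, *_ = np.linalg.lstsq(X, y, rcond=None)     # [intercept, b1, b2]

# One-step-ahead forecast from the two most recent observations.
next_week = coef @ np.array([1.0, volumes[-1], volumes[-2]])
print(f"forecast for next week: {next_week:.1f} cases")
```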
