2025 INFORMS Stakeholder Engagement Data Challenge Competition
The INFORMS Data Mining Society (DMS) is excited to announce the 2025 INFORMS Stakeholder Engagement Data Challenge, an exceptional opportunity for students to apply their data analytics and machine learning skills to a real-world community engagement problem. This student competition focuses on analyzing INFORMS 2024 satisfaction and engagement survey data to uncover insights that will help improve the experiences of INFORMS' diverse stakeholders – including students, academic faculty, and industry professionals.
Challenge Overview
Understanding what drives member satisfaction and active participation is crucial for professional societies like INFORMS to thrive. In this challenge, you will analyze a rich dataset from the 2024 INFORMS Stakeholder Satisfaction Survey, which collected feedback from thousands of INFORMS members and event participants across multiple sectors. The data includes membership status and history, conference attendance records, session evaluation scores, and other engagement indicators. Your goal is to dig into this data to identify patterns and predictors of satisfaction, pinpoint areas where certain groups may be underserved, and suggest actionable improvements. Your findings could directly influence how INFORMS designs future conferences, networking opportunities, and member programs. By understanding the needs of students, academics, and industry practitioners, INFORMS can implement targeted strategies to boost retention, enhance event content relevance, improve accessibility, and foster a more engaged analytics community. This challenge will not only let you sharpen your data science skills but also make a positive impact on a leading professional society.
Challenge Tasks
As a participant, you will be tasked with addressing the following key analytic objectives:
· Identify Key Drivers: Determine the factors that most strongly influence stakeholder satisfaction, membership retention, and active participation. For example, analyze which survey variables (e.g. quality of content, networking opportunities, career stage, etc.) are highly correlated with overall satisfaction and continued involvement in INFORMS.
· Predict Engagement Outcomes: Develop predictive models for engagement-related outcomes, such as conference attendance, session ratings, or likelihood of future participation. Using historical survey and attendance data, your model should forecast behaviors (e.g. who is likely to attend the next Annual Meeting or who might give high session evaluation scores) and help understand the predictors of these outcomes.
· Detect Underserved Segments: Uncover if any stakeholder segments (for instance, industry professionals vs. academics, or undergraduate students vs. graduate students) are experiencing lower satisfaction or engagement. Identify demographic or sector groups that show engagement gaps (e.g. consistently lower event attendance or satisfaction scores) and analyze factors that might contribute to these disparities.
· Recommend Improvement Strategies: Based on your insights, propose data-driven recommendations to enhance stakeholder engagement. Focus on strategies to improve event accessibility, content relevance, mentorship opportunities, faculty-industry collaboration, and overall community engagement. Your recommendations should be practical and aimed at making INFORMS events and services more inclusive and valuable for all members.
The primary goal is to generate actionable, data-driven insights that can meaningfully assist INFORMS management in better designing conferences, engagement strategies, and prioritizing areas for improvement. Participants are encouraged to apply a variety of appropriate analytics, visualization, and modeling techniques to achieve the best possible understanding of the underlying survey data.
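As one illustration of the first two tasks above, a minimal sketch is shown below: fit a classifier to predict an engagement outcome, then read feature importances as approximate "key drivers." The column names (`content_quality`, `networking_score`, etc.) and the synthetic data are hypothetical placeholders, not the actual survey schema; your approach and feature set will depend on the dataset released to registered teams.

```python
# Hypothetical sketch: predicting a binary engagement outcome (e.g.
# attending the next Annual Meeting) and ranking candidate drivers.
# All column names and data here are synthetic placeholders.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 500
df = pd.DataFrame({
    "content_quality": rng.integers(1, 6, n),    # 1-5 Likert scale
    "networking_score": rng.integers(1, 6, n),   # 1-5 Likert scale
    "years_member": rng.integers(0, 21, n),
    "sessions_attended": rng.integers(0, 15, n),
})
# Synthetic target driven mostly by content quality and networking.
logit = 0.8 * df["content_quality"] + 0.5 * df["networking_score"] - 4
df["will_attend"] = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X, y = df.drop(columns="will_attend"), df["will_attend"]
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])

# Feature importances give a first-pass ranking of candidate drivers;
# in a real submission you would validate these (e.g. permutation
# importance, cross-validation) rather than report them as-is.
drivers = pd.Series(model.feature_importances_, index=X.columns)
print(drivers.sort_values(ascending=False))
print(f"hold-out AUC: {auc:.2f}")
```

This is only one of many reasonable designs; regression models for continuous outcomes (such as session ratings) or clustering for segment detection would follow the same validate-then-interpret pattern.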
Deliverables
Participants are expected to submit a complete solution package that includes:
· Analytical/ML Model: A well-documented machine learning model or analytical approach that addresses the predictive tasks. This could be, for example, a regression or classification model predicting engagement outcomes, or a clustering/segmentation analysis identifying stakeholder groups. Provide any code (e.g. Python/R scripts, Jupyter notebooks) used to develop your models.
· Predictive & Analytic Outputs: Results from your analysis, such as predictions of engagement metrics or quantified importance of various drivers. This may include tables of key driver importance, model performance metrics, or any other outputs demonstrating how your approach meets the objectives.
· Insightful Visualizations: Clear and informative charts or graphs that highlight your key findings. These will be judged on clarity and how effectively they communicate insights to a broad audience.
· Final Report: A concise written report (no more than 10 pages) summarizing your approach, findings, and recommendations. The report should describe your methodology (how you identified drivers or built predictive models), present key insights (what you discovered about satisfaction and engagement), and provide actionable recommendations for INFORMS. The report should be well-organized and written in professional English. Include your visualizations and reference any analysis results in the report to support your conclusions.
Judging Criteria
Submissions will be evaluated by a panel of judges from academia and industry. The judging criteria include:
· Quality of Insights: Depth, accuracy, and significance of the findings. Are the key drivers and patterns you identified truly meaningful and well-supported by the data?
· Innovation: Creativity and originality in your analytical approach. Did you apply novel techniques or clever ideas to derive insights or improve predictions?
· Methodological Rigor: Soundness of your methods and models. Judges will consider whether you chose appropriate modeling techniques, validated your results, and avoided methodological pitfalls.
· Clarity of Visualizations: Effectiveness of your charts/graphs in conveying insights. High-quality visuals that are easy to interpret will score better.
· Practicality of Recommendations: Feasibility and impact of the proposed strategies. Are your improvement suggestions actionable and likely to meaningfully enhance stakeholder engagement for INFORMS?
· Overall Impact: The overall potential of your solution to help INFORMS leadership make data-driven decisions.
Eligibility and Submission
· Eligibility: This competition is open to all currently enrolled undergraduate and graduate students (full-time or part-time) from any college/university. Participants may enter individually or in teams of up to 3 students. Interdisciplinary teams (mixing engineering, data science, business, etc.) are welcome.
· Registration: Teams must register their intent to participate by the registration deadline (see Timeline below). Please register using this link.
· Data Access: After registration, teams will receive access to the INFORMS 2024 Survey Results dataset (an anonymized dataset in CSV/Excel format). Participants should use this provided data for their analysis.
· Submission Format: Final submissions should include all deliverables described above. Participants will upload: 1) their code and/or model files (preferably as a well-organized Jupyter Notebook or script), 2) the visualization files (images or an embedded format in the notebook), and 3) the written report (PDF format). All components can be packaged in a single ZIP file or submitted via the competition platform. Ensure the report contains the team name and participant details.
· Submission Method: Details on how to submit will be provided to registered teams. Submissions must be received by the deadline; late submissions will not be evaluated.
Timeline
Please note the key dates for the competition (all dates in Eastern Time):
· Data Release: May 15, 2025 – Official start of the competition. Dataset and detailed guidelines become available to registered participants.
· Q&A Period: May 15 – June 15, 2025 – Participants can ask questions on the competition forum or via email. Organizers will provide clarifications and dataset support during this period.
· Submission Deadline: August 15, 2025 – All competition entries (models, visuals, report) must be submitted by 11:59 PM ET on this date.
· Evaluation Period: August 16 – August 31, 2025 – Judges review submissions based on the criteria. Participants may be contacted for any necessary clarifications during this time.
· Winners Announcement: September 5, 2025 – Results announced. Three top-performing teams will be notified via email and announced on the INFORMS website and social media.
· Awards & Recognition: October 2025 – The top three teams will be invited to present their findings at the 2025 INFORMS Data Mining and Decision Analytics (DMDA) pre-conference workshop, part of the INFORMS Annual Meeting, and will compete for a total prize of $1,000. Additionally, all finalists will be recognized with a finalist award plaque from the INFORMS Data Mining Society.
Contact and Questions
For any questions or clarifications, feel free to reach out to the Competition Chairs:
Dr. Nathan Gaw, Assistant Professor at Air Force Institute of Technology (Email: nathan.gaw@au.af.edu)