Ended 5 years ago
333 participants
3118 submissions


Available Data

To build their solutions, participants will have access to a historical data export from the Ministry of Emergency Situations, as well as to open datasets:

  • NCEP Reanalysis 2 — historical weather data;
  • FIRMS — NASA data on temperature anomalies;
  • ESRL PDF — NOAA Earth System Research Laboratory (ESRL) climate data.

While studying models and preparing solutions, participants may use any available open data sources. However, only the three sources listed above will have up-to-date data available during solution testing, including the final real-time testing used to determine the winners of the competition.

Evaluation Scheme

  1. Check phase. The solution is run on a small set of historical data. This run verifies that the code is free of errors and interacts correctly with the checking system. The participant has full access to the stdout/stderr output and the test result.
  2. Public Test. The solution is run on a hidden part of the historical data, available only to the organizers. If a submitted solution fails, the participant receives only a final message describing the cause of the failure. Public test results reflect how well solutions perform on closed data during the competition and let participants track their own progress.
  3. Live Test. The participants' final solutions are run daily on new temperature-anomaly points for 14 days. The results of running the solutions on this new data in a real-world setting form the final rating, which determines the outcome of the competition.

Solution Format

Send the algorithm's code to the testing system packed in a ZIP archive. Solutions are run in an isolated environment using Docker. Time and resources are limited during testing. Participants do not need in-depth knowledge of Docker.

The archive root must contain the metadata.json file with the following content:

{
  "image": "<docker image>",
  "entrypoint": "python classify_thermopoints.py $PATH_INPUT/input.csv $PATH_OUTPUT/output.csv"
}

Here, image is the name of the Docker image in which the solution will be run, and entrypoint is the command that launches the solution. The archive root will be the current working directory for the solution.
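To illustrate the entrypoint contract, here is a minimal sketch of what a classify_thermopoints.py script could look like. The input and output column names (id, fire_probability) and the constant-probability baseline are assumptions for illustration, not the competition's actual schema:

```python
import csv
import sys

def classify_points(input_path, output_path):
    """Read thermal-anomaly points from input_path and write a prediction
    for each row to output_path. The constant baseline below is a
    placeholder for a real model."""
    with open(input_path, newline="") as fin, \
         open(output_path, "w", newline="") as fout:
        reader = csv.DictReader(fin)
        writer = csv.writer(fout)
        writer.writerow(["id", "fire_probability"])  # assumed output schema
        for row in reader:
            # Baseline: predict a constant probability for every point.
            writer.writerow([row["id"], 0.5])

if __name__ == "__main__":
    # Invoked by the checking system as:
    #   python classify_thermopoints.py $PATH_INPUT/input.csv $PATH_OUTPUT/output.csv
    classify_points(sys.argv[1], sys.argv[2])
```

The script takes the input and output paths as positional arguments, matching the $PATH_INPUT/$PATH_OUTPUT placeholders in the entrypoint command above.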

When launching, the DATASETS_PATH environment variable contains the path to the latest open datasets, which are accessible from the solution container.
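A solution can discover the mounted datasets through that environment variable. A small sketch of how this might be done; the local fallback directory name is an assumption for offline development:

```python
import os
from pathlib import Path

# DATASETS_PATH points at the directory with the latest open datasets
# mounted into the solution container; fall back to a local folder when
# developing outside the container (the fallback name is an assumption).
datasets_root = Path(os.environ.get("DATASETS_PATH", "./datasets"))

def list_available_datasets(root):
    """Return the sorted names of dataset files/folders found under root,
    or an empty list if the directory does not exist."""
    root = Path(root)
    if not root.exists():
        return []
    return sorted(p.name for p in root.iterdir())
```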

Participants are given an example of a correct baseline solution by the organizers. Any additional materials for participants are available on the competition's GitHub page.

Technical Constraints on Solutions

  • Available resources: 16 GB RAM, 4 vCPUs;
  • Time limit for running the solution: 30 minutes;
  • The solution has no access to Internet resources;
  • Maximum size of the solution archive: 5 GB;
  • Maximum size of the (publicly available) Docker image: 20 GB;
  • The solution has access to the latest versions of the open datasets.
