Data Processing

Both the input and ground-truth data are high dynamic range (HDR). For evaluation, all metrics are computed on tonemapped images (Modified Reinhard). The tone mapping operation can be expressed as f(x) = x / (x + 0.25).
Please refer to the tone_map function in evaluate.py of the scoring program.
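The tone mapping above can be sketched in a few lines of NumPy. This is an illustrative re-implementation of the stated formula, not the official tone_map from evaluate.py:

```python
import numpy as np

def tone_map(x):
    """Modified Reinhard tone mapping: f(x) = x / (x + 0.25).

    Compresses HDR values in [0, inf) into [0, 1). Illustrative
    sketch, not the official scoring code.
    """
    x = np.asarray(x, dtype=np.float64)
    return x / (x + 0.25)

# Metrics are computed on the tonemapped images, e.g.:
hdr = np.array([0.0, 0.25, 1.0, 500.0])
ldr = tone_map(hdr)  # values compressed toward 1.0
```

Because f is monotonic and bounded below 1, very bright HDR values are compressed strongly while dark values are preserved, which is why the metrics are computed after this mapping.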

Data Download

The training split contains 2016 pairs of 800×800×3 images. Image values range over [0, 500] and are stored as '.npy' files. The validation and testing sets contain 40 pairs of images each. The input images from the validation and testing sets are provided; the ground truth is not available to participants. The validation server will be closed when the testing phase begins.
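A pair in this format can be loaded and sanity-checked with NumPy. The snippet below first synthesizes a dummy file so it is self-contained; the filename and value range follow the description above:

```python
import numpy as np

# Simulate one image in the challenge format: an 800x800x3 array
# with values in [0, 500], stored as '.npy' (dummy data for illustration).
rng = np.random.default_rng(0)
img = rng.uniform(0.0, 500.0, size=(800, 800, 3)).astype(np.float32)
np.save("001.npy", img)  # hypothetical filename

# Load and sanity-check shape and value range before training.
loaded = np.load("001.npy")
assert loaded.shape == (800, 800, 3)
assert loaded.min() >= 0.0 and loaded.max() <= 500.0
```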

Submission

For submitting the results, you need to follow these steps:

  1. Process the input images and keep the same name for each output image as its input (example: for an input file named "001.npy", the output file should also be named "001.npy").
    Note that the output HDR images should be saved as '.npy' and should have the same size as the input images.
  2. The readme.txt file should contain the following lines, filled in with the runtime per image (in seconds) of the solution, the number of model parameters, and 1 or 0 indicating whether extra data was used for training the models:
    Runtime per image [s] : 0.10 
    Parameters : 1050290
    Extra Data [1] / No Extra Data [0] : 1
    Other description : GPU: Titan Xp; Extra data: DF2K 
    The last part of the file can contain any description you want of the code producing the provided results (dependencies, links, scripts, etc.).
    This information is important both during the validation period, when different teams can compare their results and solutions, and for establishing the final ranking of the teams and their methods.
  3. Create a ZIP archive containing all the output image results, named as above, and a readme.txt. Note that the archive should not include folders; all the images/files should be in the root of the archive.
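The packaging steps above can be sketched as follows. The folder name, output filename, and metadata values are placeholders; the key detail is that every file is written to the root of the archive:

```python
import zipfile
from pathlib import Path

import numpy as np

# Hypothetical folder holding the method's output '.npy' files
# (a dummy result is created here so the sketch is self-contained).
results_dir = Path("results")
results_dir.mkdir(exist_ok=True)
np.save(results_dir / "001.npy", np.zeros((800, 800, 3), dtype=np.float32))

# readme.txt with the required metadata lines (placeholder values).
readme = (
    "Runtime per image [s] : 0.10\n"
    "Parameters : 1050290\n"
    "Extra Data [1] / No Extra Data [0] : 1\n"
    "Other description : GPU: Titan Xp; Extra data: DF2K\n"
)
(results_dir / "readme.txt").write_text(readme)

# All files must sit at the root of the archive -- no subfolders.
with zipfile.ZipFile("submission.zip", "w", zipfile.ZIP_DEFLATED) as zf:
    for f in sorted(results_dir.iterdir()):
        zf.write(f, arcname=f.name)  # arcname strips the folder prefix
```

Passing `arcname=f.name` is what keeps the archive flat: without it, `zipfile` would record each entry under its `results/` prefix and the submission would be rejected.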

We recommend using our [scoring program] to verify that your submission is correct.

Final Submission

Participants will need to submit their results on the Codalab server. The submission format is the same as in the validation phase.

In addition to submitting on the Codalab server, all participants should email the challenge organizer at mipi.challenge@gmail.com to complete the final submission.

The subject of the email should be: UDC MIPI-Challenge - TEAM_NAME

The body of the email should include the following information:
a) the challenge name
b) team name
c) team leader's name and email address
d) rest of the team members
e) team name and user names on UDC CodaLab competitions
f) executable/source code attached or download links
g) a factsheet describing the solution(s) [factsheet template] (required of every team in the final testing phase)
h) download link to the results of all of the test frames

Note that the executable/source code should include pre-trained models or the necessary parameters so that we can run it and reproduce the results. There should be a README or description that explains how to execute the code. The factsheet must be a compiled PDF file. Please provide a detailed explanation.