News from: MIMIC-III Clinical Database v1.4.
Sept. 30, 2019
The MIMIC-III database is now available on two major cloud platforms: Google Cloud Platform (GCP) and Amazon Web Services (AWS). To access the data on the cloud, simply add the relevant cloud identifier to your PhysioNet profile. Further instructions are available on the MIMIC-III website.
News from: MIMIC-CXR Database v2.0.0.
Sept. 27, 2019
The MIMIC-CXR Database has been updated to v2.0.0. The database now contains DICOM format image files and free-text radiology reports. All data has been de-identified.
Credentialed users can access the data after signing the data use agreement.
Some users may prefer JPG format images for convenience, despite the information lost when storing images in this format. We plan to release a distinct project on PhysioNet containing the same images as MIMIC-CXR in JPG format, along with structured labels derived from the radiology reports. Stay tuned!
Read more: https://physionet.org/content/mimic-cxr/
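As a toy illustration of why the JPG conversion is lossy (assumed numbers, not the actual release pipeline): chest radiograph DICOMs typically store pixel data at a higher bit depth than the 8 bits that baseline JPG supports, so distinct intensities can collapse into one value.

```python
# Illustration only: squeezing a 12-bit intensity range (0..4095)
# into 8 bits (0..255) by linear rescaling, as a JPG conversion must.

def to_8bit(value, max_in=4095):
    """Map a 12-bit intensity to 8 bits by linear rescaling."""
    return round(value * 255 / max_in)

# Two distinct 12-bit pixel values become identical after conversion:
a, b = 2000, 2005
print(to_8bit(a), to_8bit(b))  # both map to 125
```

Sixteen 12-bit levels map onto each 8-bit level on average, which is why the DICOM files remain the authoritative source for quantitative work.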
News from: MIMIC-CXR Database v1.0.0.
Aug. 26, 2019
Due to technical issues, the chest x-rays are currently unavailable. We apologize for the inconvenience.
July 8, 2019
- Thanks to our sponsors, a pre-conference hackathon (with prizes) will take place on Sunday 8th September in Singapore. Registration is required. Rules and more information can be found here.
June 23, 2019
- Two "wild card" teams were approved for inclusion in the Challenge. We are looking forward to meeting them in Singapore!
May 28, 2019
- We are accepting requests for Google Cloud Credits ($500 per eligible team). Requests are due by 29 May 2019; please see here for information on how to apply. Requests will be processed in order and forwarded to Google. We cannot guarantee that you will receive credits, especially for requests made later in the competition.
May 24, 2019
- By now you should have received acceptance or rejection notices for your abstracts. If your abstract was rejected, please do not worry: there are two more ways to compete. See here and here for more information.
- We have added a "wild card" entry to allow one more team to enter the competition and be eligible for all the prizes. See here for more information on this.
- There will be an on-site hackathon revisiting the Challenge (with a separate award) on Sunday 8th September, before the conference begins in Singapore. Any team with at least one attendee at the conference, whether or not it previously registered for the Challenge, is eligible to enter by registering for the hackathon in person. See here for more information on this.
May 2, 2019
- Please check the leaderboard for the current Challenge scores.
April 22, 2019
- We have made several changes from the unofficial phase of the Challenge (see below). We invite comments and questions about these changes. We plan to accept submissions again on Thursday, 25 April at 12:01 am GMT.
- We will only use our new cloud submission system for the official phase of the Challenge. See the updated instructions (here) for details. The past submission system is no longer available.
- We ask participants to write causal algorithms that make predictions using current and past (but not future) information. See the submission instructions and sample prediction code (here) for details.
- Note that you will have 10 submissions in this official phase. We will score your results on a subset of the test data. At the end of the competition we will ask you to nominate your 'best' algorithm (it need not be the one that gave you the best score) and we will run it on the full test data to provide the final test score.
- Please do not submit all ten entries in the last week of the competition. Even though we can scale the computing, failures require manual intervention and feedback. We can't do this for 1000+ entries in the final week.
- Also, we will be offering up to $500 in Google Cloud credits (courtesy of the Google Cloud Team) to the best-performing entries by June, so it's worth getting a good score before then!
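The causality requirement above can be sketched as follows. This is a hypothetical toy rule and data layout, not the official sample prediction code: the point is only that the prediction at each hour sees current and past rows, never future ones.

```python
# Minimal sketch of a causal prediction loop (hypothetical data and model):
# at each time step, only rows up to and including the current one are
# visible to the predictor.

def predict(visible_rows):
    """Toy rule standing in for a real model: flag sepsis risk if the
    latest heart rate exceeds a threshold (column 0 is assumed to be
    heart rate here)."""
    latest_hr = visible_rows[-1][0]
    return 1 if latest_hr > 100 else 0

def run_causal(records):
    """records: list of per-hour feature rows for one patient.
    Returns one prediction per hour, each computed causally."""
    predictions = []
    for t in range(len(records)):
        visible = records[: t + 1]  # current and past rows only
        predictions.append(predict(visible))
    return predictions

# Example: three hourly rows, heart rate rising past the toy threshold.
hours = [[85.0], [98.0], [110.0]]
print(run_causal(hours))  # [0, 0, 1]
```

A model that sliced `records` beyond index `t` (e.g. smoothing over the whole stay) would violate the causality requirement even if its hourly outputs looked similar.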
April 22, 2019
The new PhysioNet website is intended to simplify the process of finding and reusing data. Key improvements include:
- an updated search tool for finding data and software
- a new registration system
- a simplified process for sharing your data and software
We welcome your feedback! Please send us your comments using our feedback form.
Your ideas and suggestions will be used for continuous improvement.
Read more: https://forms.gle/WQh3jaZj53yygQJ78
April 11, 2019
- We have now implemented a cloud submission system. Instructions for the cloud submission system are available here.
- The deadline of the unofficial phase of the competition has been extended to 11.59pm GMT on Sunday 14th April 2019. This will give you enough time to receive scores and, very importantly, submit abstracts to cinc.org before the 15th. Please see here for important hints on how to prepare a successful abstract, even if you don't receive a score.
- If you don't receive a score, don't panic: simply submit an abstract to cinc.org by the deadline on 15th April with some results, cross-validated on the training data.
- The hiatus has been moved: it starts on 15th April and lasts until midnight GMT on 21st April. The Challenge re-opens at 12.01 am GMT on 22nd April.
April 8, 2019
- The entry submission system is available here.
- The deadline of the unofficial phase of the competition has been extended to 11.59pm GMT on Wednesday 10th April 2019.
April 1, 2019
- An expanded training database is now available, containing data from a total of 40,336 subjects.
Feb. 22, 2019
- Publications from the 2018 Challenge are now available.
Feb. 8, 2019
The PhysioNet/Computing in Cardiology Challenge 2019 has now begun! This year's topic is prediction of sepsis from clinical data. We are delighted to announce that this year's Challenge is being sponsored by the Gordon and Betty Moore Foundation, Google, and MathWorks.
Read more: https://physionet.org/challenge/2019/
Feb. 6, 2019
- This year's challenge is co-sponsored by the Gordon and Betty Moore Foundation, MathWorks, and Google.
- The PhysioNet Challenge data, examples, and scoring functions can be downloaded here.
- If you have any questions or comments regarding this Challenge, then please post them directly in our Community Discussion Forum. This will increase transparency (benefiting all the competitors) and ensure that all the Challenge organizers see your question.