Feb. 10, 2020
We are delighted to announce the PhysioNet/Computing in Cardiology Challenge 2020 on Classification of 12-lead ECGs.
For more information, see the Challenge website: https://physionetchallenges.github.io/2020/
Quick links for this year's Challenge can be found here:
- Registration form: https://bit.ly/37d21iN
- Public discussion forum: https://groups.google.com/forum/#!forum/physionet-challenges
- Rules and deadlines: https://physionetchallenges.github.io/2020/#rules-and-deadlines
More information will be posted on the website linked above (and eventually mirrored on physionet.org/challenge/2020 as it becomes available). Please check the Challenge forum for real-time updates, and please post questions and comments there. However, if your question would reveal information about your entry, please email challenge [at] physionet.org instead. We may post parts of our reply publicly if we feel that all Challengers would benefit from the information in our response. We will not answer emails about the Challenge sent to any other address.
Many thanks again for your continued support of this event and we hope you enjoy this year's challenge.
- Erick, Matt, Gari and all the team at PhysioNet
Read more: https://physionetchallenges.github.io/2020/
News from: MIMIC-CXR Database v2.0.0.
Feb. 10, 2020
A journal article describing the MIMIC-CXR database was recently published in Scientific Data. The article provides detail on the collection, curation, and processing carried out to create the database. The article is open access and available online.
The database has also been preprocessed into compressed JPG format images, which have been made available on PhysioNet as the MIMIC-CXR-JPG Database. The database includes labels extracted from the free-text reports using publicly available tools. You can read more about the creation of this resource in our arXiv preprint.
Finally, we have created the mimic-cxr GitHub repository for collaborative code development on MIMIC-CXR. The code used to generate MIMIC-CXR-JPG from MIMIC-CXR is already available in the repository. We welcome code contributions from all users, and we encourage discussion of the data via GitHub issues.
Johnson AE, Pollard TJ, Berkowitz SJ, Greenbaum NR, Lungren MP, Deng CY, Mark RG, Horng S. MIMIC-CXR, a de-identified publicly available database of chest radiographs with free-text reports. Scientific Data. 2019;6.
Johnson AE, Pollard TJ, Greenbaum NR, Lungren MP, Deng C-Y, Peng Y, Lu Z, Mark RG, Berkowitz SJ, Horng S. MIMIC-CXR-JPG, a large publicly available database of labeled chest radiographs. arXiv preprint arXiv:1901.07042. 2019.
Jan. 23, 2020
The WiDS Datathon 2020 focuses on patient health using data from MIT's GOSSIS (Global Open Source Severity of Illness Score) initiative. It is brought to you by the Global WiDS team, the West Big Data Innovation Hub, and the WiDS Datathon Committee. Winners will be announced at the WiDS Conference at Stanford University and via livestream, reaching a community of 100,000+ data enthusiasts across more than 50 countries.
Nov. 21, 2019
The MIT Laboratory for Computational Physiology is seeking a Research Software Engineer to support and undertake projects focused on improving patient care. Joining a team of data scientists and clinicians, the Engineer will help to manage the system network, write code for research studies, and initiate and develop research software.
The position would be a good fit for someone with a bachelor’s or master’s degree in a technical subject such as biomedical engineering, physics, computer science, or equivalent experience. For more details, please see the job posting on the MIT website or contact us directly if you have questions.
Nov. 5, 2019
If you have any questions or comments regarding this challenge, please post them directly in our Community Discussion Forum. This increases transparency (benefiting all competitors) and ensures that all of the challenge organizers see your question.
Nov. 4, 2019
The official paper describing the 2019 PhysioNet Challenge will appear in Critical Care Medicine.
News from: MIMIC-III Clinical Database v1.4.
Sept. 30, 2019
The MIMIC-III database is now available on two major cloud platforms: Google Cloud Platform (GCP) and Amazon Web Services (AWS). To access the data on the cloud, simply add the relevant cloud identifier to your PhysioNet profile. Further instructions are available on the MIMIC-III website.
Sept. 27, 2019
The MIMIC-CXR Database has been updated to v2.0.0. The database now contains DICOM format image files and free-text radiology reports. All data has been de-identified.
Credentialed users can access the data after signing the data use agreement.
Some users may prefer JPG format images for convenience, despite the information lost when storing images in this format. We plan to release a distinct project on PhysioNet containing the same images as MIMIC-CXR, but in JPG format. This project will also contain structured labels derived from the radiology reports. Stay tuned!
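To illustrate where the information loss comes from: DICOM chest radiographs typically store pixel intensities at 12 to 16 bits, while standard JPEG is limited to 8 bits per channel, so conversion must rescale (and clip) intensities into 0-255. The sketch below is a hypothetical, minimal illustration of that rescaling step, not the actual conversion code in the mimic-cxr repository:

```python
def to_8bit(pixels, lo=None, hi=None):
    """Linearly rescale raw pixel values (often 12-16 bit in DICOM)
    into the 0-255 range required by standard 8-bit JPEG.
    Values outside [lo, hi] are clipped; distinct high-bit-depth
    intensities can collapse to the same 8-bit value, which is one
    source of information loss in the conversion."""
    if lo is None:
        lo = min(pixels)
    if hi is None:
        hi = max(pixels)
    span = hi - lo or 1  # avoid division by zero for flat images
    return [max(0, min(255, round((p - lo) * 255 / span))) for p in pixels]

# Two distinct 16-bit intensities (16 and 32) map to the same 8-bit value:
raw = [0, 16, 32, 65535]
print(to_8bit(raw))  # → [0, 0, 0, 255]
```

Real conversion pipelines would also apply the DICOM windowing and photometric-interpretation metadata before quantizing, which this toy example omits.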
Read more: https://physionet.org/content/mimic-cxr/
Sept. 17, 2019
July 8, 2019
- Thanks to our sponsors, a pre-conference hackathon (with prizes) will take place on Sunday 8th September in Singapore. Registration is required. Rules and more information can be found here.
June 23, 2019
- Two "wild card" teams were approved for inclusion in the Challenge. We are looking forward to meeting them in Singapore!
May 28, 2019
- We are accepting requests for Google Cloud credits ($500 per eligible team). Requests are due by 29 May, 2019. Please see here for information on how to apply. Requests will be processed in order and sent on to Google. We cannot guarantee that you will receive credits, especially later in the competition.
May 24, 2019
- By now you should have received your acceptance or rejection notices for your abstracts. If you were rejected, please do not worry - there are two more options to compete. See here and here for more information on this.
- We have added a "wild card" entry to allow one more team to enter the competition and be eligible for all the prizes. See here for more information on this.
- There will be an on-site hackathon revisiting the Challenge (with a separate award) on Sunday 8th September, before the conference begins in Singapore. Any team (whether previously registered for the Challenge or not) with at least one attendee at the conference who registers in person for the Hackathon is eligible to enter. See here for more information on this.
May 2, 2019
- Please check the leaderboard for the current Challenge scores.
April 25, 2019
April 22, 2019
- We have made several changes from the unofficial phase of the Challenge (see below). We invite comments and questions about these changes. We plan to accept submissions again on Thursday, 25 April at 12:01 am GMT.
- We will only use our new cloud submission system for the official phase of the Challenge. See the updated instructions (here) for details. The past submission system is no longer available.
- We ask participants to write causal algorithms that make predictions using current and past (but not future) information. See the submission instructions and sample prediction code (here) for details.
- Note that you will have 10 submissions in this official phase. We will score your results on a subset of the test data. At the end of the competition we will ask you to nominate your 'best' algorithm (it need not be the one that gave you the best score) and we will run it on the full test data to provide the final test score.
- Please do not submit all ten entries in the last week of the competition. Even though we can scale the computing, failures require manual intervention and feedback. We can't do this for 1000+ entries in the final week.
- Also, we will be offering up to $500 in Google Cloud credits (courtesy of the Google Cloud team) to the best-performing entries by June, so it's worth getting a good score before then!
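The causality requirement above means that the prediction for time step t may depend only on data observed up to and including t, never on future measurements. A minimal sketch of this evaluation pattern is below; `run_causal` and the toy heart-rate predictor are hypothetical names for illustration, not part of the official submission code:

```python
def run_causal(records, predict):
    """Apply `predict` causally: the prediction at time step t sees
    only records[0..t]. This mirrors the rule that submissions must
    use current and past (but not future) information."""
    preds = []
    for t in range(len(records)):
        visible = records[: t + 1]   # past and current data only
        preds.append(predict(visible))
    return preds

# Toy predictor (hypothetical): flag risk once heart rate has exceeded 100.
hr = [80, 90, 105, 95]
print(run_causal(hr, lambda seen: int(max(seen) > 100)))  # → [0, 0, 1, 1]
```

Note that the flag at the final step stays 1 even though the current value is 95: the predictor may look back at earlier data, just never forward.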
April 22, 2019
The new PhysioNet website is intended to simplify the process of finding and reusing data. Key improvements include:
- an updated search tool for finding data and software
- a new registration system
- a simplified process for sharing your data and software
We welcome your feedback! Please send us your comments using our feedback form; your ideas and suggestions will help us continuously improve the site.
Read more: https://forms.gle/WQh3jaZj53yygQJ78
April 11, 2019
- We have now implemented a cloud submission system. Instructions for the cloud submission system are available here.
- The deadline of the unofficial phase of the competition has been extended to 11.59pm GMT on Sunday 14th April 2019. This will give you enough time to receive scores and, very importantly, to submit abstracts to cinc.org before the 15th. Please see here for important hints on preparing a successful abstract, even if you don't receive a score.
- If you don't receive a score, don't panic - just submit an abstract to cinc.org by the deadline on 15th April with some results, cross-validated on the training data.
- The hiatus has now been moved: it starts on 15th April and lasts until 12 midnight GMT on 21st April. The Challenge re-opens at 12.01 am GMT on 22nd April.
April 8, 2019
- The entry submission system is available here.
- The deadline of the unofficial phase of the competition has been extended to 11.59pm GMT on Wednesday 10th April 2019.
April 1, 2019
- An expanded training database is now available, containing data from a total of 40,336 subjects.