Rules

Several detection tasks based on domain knowledge were designed; these are also fundamental tasks in vision-based structural health monitoring (SHM). Contestants are required to submit their predictions on the test dataset. Final results will be evaluated for both the individual tasks and overall performance. For overall performance, a weighted sum will be adopted, with each task assigned a weight reflecting its difficulty. Missing results will be counted as zero in the overall performance. Please refer to “Metric, Ranking, and Awards” for more information about the final scoring. The tasks are the following:

 

Classification tasks: 

  1. Easy:

  • Damage check: 2 classes (yes/no);
  • Spalling condition: 2 classes (yes/no);
  • Material type: 2 classes (steel/others)

 

  2. Medium:

  • Scene classification: 3 classes (pixel/object/structural levels);
  • Collapse check: 3 classes (no/partial collapse/collapse);
  • Component type: 4 classes (beam/column/wall/else);

 

  3. Difficult:

  • Damage level: 4 classes (no/minor/moderate/heavy damage)
  • Damage type: 4 classes (no/flexural/shear/combined)

Examples of the classification tasks are given in Tasks. For more details about the detection tasks, please refer to:

Gao, Y. and Mosalam, K. M. (2018). Deep Transfer Learning for Image-Based Structural Damage Recognition. Computer-Aided Civil and Infrastructure Engineering. doi:10.1111/mice.12363 [link]

 

Metrics

Each detection task will be evaluated based on the categorical accuracy of classification. A higher accuracy on the test dataset indicates better team performance.

accuracy = (number of correctly classified test images) / (total number of test images)
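The per-task metric above can be sketched in a few lines of Python; this is a minimal illustration of categorical accuracy, assuming predictions and ground-truth labels are given as equal-length lists of class labels (the function name and example labels are ours, not part of the official evaluation code):

```python
def categorical_accuracy(y_true, y_pred):
    """Fraction of predictions that exactly match the true class label."""
    correct = sum(t == p for t, p in zip(y_true, y_pred))
    return correct / len(y_true)

# Illustrative example: the 4-class damage-level task (no/minor/moderate/heavy)
y_true = ["no", "minor", "heavy", "moderate"]
y_pred = ["no", "minor", "heavy", "heavy"]
print(categorical_accuracy(y_true, y_pred))  # 0.75
```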

Overall performance will be evaluated based on the loss of points for misclassifications. Therefore, a higher score indicates better team performance.
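The overall scoring rule described in the Rules section (a weighted sum over tasks, with missing results counted as zero) could be sketched as follows. The task names and weights here are purely illustrative; the actual difficulty factors are assigned by the organizers and are not published in this document:

```python
def overall_score(task_scores, weights):
    """Weighted sum of per-task scores; tasks with no submitted result count as 0."""
    return sum(w * task_scores.get(task, 0.0) for task, w in weights.items())

# Illustrative weights only -- actual difficulty factors are set by the organizers.
weights = {"damage_check": 1.0, "damage_level": 2.0}

# A team that submitted only the damage-check task: damage_level counts as zero.
print(overall_score({"damage_check": 0.9}, weights))  # 0.9
```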

 

Team composition

  • Participants may compete as individuals or as teams of no more than 4 individuals
  • Interdisciplinary teams are encouraged
  • Teams should submit applications for participation (See timeline later)

 

Competition Pools

Students, researchers, and practitioners outside the Civil Engineering field are also encouraged to participate in the PHI Challenge. For example, students majoring in Computer Science (CS) or Data Science (DATA) usually have sufficient background in programming and data analysis to participate in the competition. There will be two competition pools for the final results evaluation and awards: CS/DATA major teams and non-CS/DATA major teams.

  • If at least one team member is a Computer Science or Data Science major/graduate, the team will be placed in the pool of CS/DATA.
  • If all team members are non-CS/DATA majors/graduates, the team will be placed in the pool of Non-CS/DATA.

Each team's pool assignment will be verified through the application confirmation email.

 

Application

Application is required for the PHI Challenge. The applying team will be notified of their eligibility status after the application is reviewed by the review committee.

Teams that apply but are not determined to be eligible are still welcome to join the competition by submitting results to Kaggle; however, these teams will not be considered in the final ranking and awards.

 

Ranking

Team ranking will be based on the final score of the team. Final rankings will be released within two weeks after the submittal deadline. In addition, the competition will be hosted on the Kaggle platform (https://www.kaggle.com/competitions), which evaluates submissions in real time. Once you submit your predictions on the test data through Kaggle, your score and ranking will be posted on the public leaderboard, which evaluates only part of the test data. This can help you evaluate and adjust your model/algorithm in real time. However, only 2 submissions per day will be allowed.

 

Awards

Awards will be issued for the detection tasks and the two competition pools:

Top 3 overall performance for CS/DATA major teams:

  • Gold medal: 1st ranking in final score
  • Silver medal: 2nd ranking in final score
  • Bronze medal: 3rd ranking in final score

Top 3 overall performance for Non-CS/DATA major teams:

  • Gold medal: 1st ranking in final score
  • Silver medal: 2nd ranking in final score
  • Bronze medal: 3rd ranking in final score

Top performance in each single task:

  • 1st-ranked CS/DATA major team for each individual task
  • 1st-ranked Non-CS/DATA major team for each individual task

Best report award:

  • 1st ranking for final report
