| Field | Value |
|---|---|
| DOI | https://doi.org/10.1016/j.firesaf.2022.103629 |
| Authors | Li, Yuchuan¹; Ko, Yoon¹ (ORCID: https://orcid.org/0000-0001-9644-5108); Lee, Wonsook |
| Affiliation | ¹ National Research Council Canada. Construction |
| Funder | National Research Council |
| Format | Text, Article |
| Physical description | 13 p. |
| Subject | flashover; deep neural networks; dual-attention generative adversarial network; image processing; fire safety science |
| Abstract | This paper proposes a novel hybrid model for flashover prediction in a compartment fire based on visual information from RGB images, such as those captured by ordinary vision cameras. The proposed model was developed as a research tool to study the feasibility of predicting flashover from RGB vision data. The model consists of sub-modules combining data-based methods using Deep Neural Networks with knowledge-based methods drawing on fire safety science and mathematical models. A crucial feature of the proposed model is enabled by a novel Dual-Attention Generative Adversarial Network, developed in this study for the vision-to-infrared conversion process. The model and the overall procedure were validated against published test data from a compartment fire. Results show that the proposed model achieved promising performance and has the potential to monitor the constantly changing conditions of a room fire through continuous processing of flame and smoke images. |
| Publication date | 2022-07-20 |
| Publisher | Elsevier BV |
| Published in | |
| Language | English |
| Peer-reviewed | Yes |
| Identifier | S0379711222001072 |
| Record identifier | 1f7a6170-f305-4225-9c33-9a3e11d0069c |
| Record created | 2022-08-03 |
| Record modified | 2022-08-04 |