IEEE/CAA Journal of Automatica Sinica
Citation: Hui Cao, Yajie Yu, Panpan Zhang and Yanxia Wang, "Flue Gas Monitoring System With Empirically-Trained Dictionary," IEEE/CAA J. Autom. Sinica, vol. 7, no. 2, pp. 606-616, Mar. 2020. doi: 10.1109/JAS.2019.1911642
[1] S. Lakkis, R. Younes, Y. Alayli, and M. Sawan, "Review of recent trends in gas sensing technologies and their miniaturization potential," Sensor Review, vol. 34, no. 1, pp. 24–35, 2014. doi: 10.1108/SR-11-2012-724
[2] Z. M. Ye, "Artificial-intelligence approach for biomedical sample characterization using Raman spectroscopy," IEEE Trans. Autom. Science and Engineering, vol. 2, no. 1, pp. 67–73, 2005.
[3] C. Feng, X. N. Gao, Y. T. Tang, and Y. S. Zhang, "Comparative life cycle environmental assessment of flue gas desulphurization technologies in China," J. Cleaner Production, vol. 68, no. 2, pp. 81–92, 2014.
[4] C. B. Cai, L. Xu, W. Zhong, Y. Y. Tao, B. Wang, H. W. Yang, and M. Q. Wen, "Studying a gas-solid multi-component adsorption process with near-infrared process analytical technique: experimental setup, chemometrics, adsorption kinetics and mechanism," Chemometrics and Intelligent Laboratory Systems, vol. 144, pp. 80–86, 2015.
[5] N. Sheng, Q. Liu, S. J. Qin, and T. Y. Chai, "Comprehensive monitoring of nonlinear processes based on concurrent kernel projection to latent structures," IEEE Trans. Autom. Science and Engineering, vol. 13, no. 2, pp. 1129–1137, 2016.
[6] S. Wold, M. Sjöström, and L. Eriksson, "PLS-regression: a basic tool of chemometrics," Chemometrics and Intelligent Laboratory Systems, vol. 58, no. 2, pp. 109–130, 2001. doi: 10.1016/S0169-7439(01)00155-1
[7] H. J. Yang and J. K. Liu, "An adaptive RBF neural network control method for a class of nonlinear systems," IEEE/CAA J. Autom. Sinica, vol. 5, no. 2, pp. 457–462, 2018. doi: 10.1109/JAS.2017.7510820
[8] X. L. Li, W. G. Song, L. P. Lian, and X. G. Wei, "Forest fire smoke detection using back-propagation neural network based on MODIS data," Remote Sensing, vol. 7, no. 4, pp. 4473–4498, 2015. doi: 10.3390/rs70404473
[9] B. J. de Kruif and T. J. A. de Vries, "Pruning error minimization in least squares support vector machines," IEEE Trans. Neural Networks, vol. 14, no. 3, pp. 696–702, 2003. doi: 10.1109/TNN.2003.810597
[10] W. Y. Zhang, H. G. Zhang, J. H. Liu, K. Li, D. S. Yang, and H. Tian, "Weather prediction with multiclass support vector machines in the fault detection of photovoltaic system," IEEE/CAA J. Autom. Sinica, vol. 4, no. 3, pp. 520–525, 2017. doi: 10.1109/JAS.2017.7510562
[11] Y. Zhou, T. B. Liu, and J. R. Li, "Rapid identification between edible oil and swill-cooked dirty oil by using a semisupervised support vector machine based on graph and near-infrared spectroscopy," Chemometrics and Intelligent Laboratory Systems, vol. 143, pp. 1–6, 2015. doi: 10.1016/j.chemolab.2015.02.004
[12] S. Bersimis, S. Psarakis, and J. Panaretos, "Multivariate statistical process control charts: an overview," Quality & Reliability Engineering International, vol. 23, no. 5, pp. 517–543, 2007.
[13] C. S. Chen and J. M. Lin, "Applying Rprop neural network for the prediction of the mobile station location," Sensors, vol. 11, no. 4, pp. 4207–4230, 2011.
[14] C. C. Chang and C. J. Lin, "LIBSVM: a library for support vector machines," ACM Trans. Intelligent Systems and Technology, vol. 2, no. 3, pp. 27:1–27:27, 2011.
[15] B. Waske, S. van der Linden, J. A. Benediktsson, A. Rabe, and P. Hostert, "Sensitivity of support vector machines to random feature selection in classification of hyperspectral data," IEEE Trans. Geoscience and Remote Sensing, vol. 48, no. 7, pp. 2880–2889, 2010. doi: 10.1109/TGRS.2010.2041784
[16] Z. L. Cai and W. Zhu, "Feature selection for multi-label classification using neighborhood preservation," IEEE/CAA J. Autom. Sinica, vol. 5, no. 1, pp. 320–330, 2018. doi: 10.1109/JAS.2017.7510781
[17] P.-M. Juneau, A. Garnier, and C. Duchesne, "The undecimated wavelet transform-multivariate image analysis (UWT-MIA) for simultaneous extraction of spectral and spatial information," Chemometrics and Intelligent Laboratory Systems, vol. 142, pp. 304–318, 2015. doi: 10.1016/j.chemolab.2014.09.007
[18] Y. Zhao, X. Xu, and Y. He, "A novel hyperspectral feature-extraction algorithm based on waveform resolution for raisin classification," Applied Spectroscopy, vol. 69, no. 12, pp. 1442–1456, 2015. doi: 10.1366/14-07617
[19] F. van der Heijden, R. P. W. Duin, D. de Ridder, and D. M. J. Tax, Classification, Parameter Estimation and State Estimation: An Engineering Approach Using MATLAB. Chichester, UK: Wiley, 2004.
[20] T. Kind, V. Tolstikov, O. Fiehn, and R. H. Weiss, "A comprehensive urinary metabolomic approach for identifying kidney cancer," Analytical Biochemistry, vol. 363, no. 2, pp. 185–195, 2007.
[21] B. C. Kuo and D. A. Landgrebe, "Nonparametric weighted feature extraction for classification," IEEE Trans. Geoscience and Remote Sensing, vol. 42, no. 5, pp. 1096–1105, 2004.
[22] S. Chu, S. Narayanan, and C. C. J. Kuo, "Environmental sound recognition with time-frequency audio features," IEEE Trans. Audio, Speech, and Language Processing, vol. 17, no. 6, pp. 1142–1158, 2009.
[23] H. Cheng, Z. C. Liu, L. Yang, and X. W. Chen, "Sparse representation and learning in visual recognition: theory and applications," Signal Processing, vol. 93, no. 6, pp. 1408–1425, 2013. doi: 10.1016/j.sigpro.2012.09.011
[24] E. L. Zhang, X. R. Zhang, H. Y. Liu, and L. C. Jiao, "Fast multifeature joint sparse representation for hyperspectral image classification," IEEE Geoscience and Remote Sensing Letters, vol. 12, no. 7, pp. 1397–1401, 2015. doi: 10.1109/LGRS.2015.2402971
[25] B. A. Olshausen and D. J. Field, "Sparse coding with an overcomplete basis set: a strategy employed by V1?" Vision Research, vol. 37, no. 23, pp. 3311–3325, 1997. doi: 10.1016/S0042-6989(97)00169-7
[26] M. Aharon, M. Elad, and A. Bruckstein, "K-SVD: an algorithm for designing overcomplete dictionaries for sparse representation," IEEE Trans. Signal Processing, vol. 54, no. 11, pp. 4311–4322, 2006. doi: 10.1109/TSP.2006.881199
[27] M. Elad, M. A. T. Figueiredo, and L. Yu, "On the role of sparse and redundant representations in image processing," Proc. IEEE, vol. 98, no. 6, pp. 972–982, 2010. doi: 10.1109/JPROC.2009.2037655
[28] I. Tosic and P. Frossard, "Dictionary learning," IEEE Signal Processing Magazine, vol. 28, no. 2, pp. 27–38, 2011. doi: 10.1109/MSP.2010.939537
[29] K. Engan, S. O. Aase, and J. H. Husoy, "Method of optimal directions for frame design," in Proc. IEEE Int. Conf. Acoustics, Speech, and Signal Processing, 1999, vol. 5, pp. 2443–2446.
[30] K. Skretting and K. Engan, "Recursive least squares dictionary learning algorithm," IEEE Trans. Signal Processing, vol. 58, no. 4, pp. 2121–2130, 2010. doi: 10.1109/TSP.2010.2040671
[31] A. Gobrecht, R. Bendoula, J.-M. Roger, and V. Bellon-Maurel, "Combining linear polarization spectroscopy and the representative layer theory to measure the Beer–Lambert law absorbance of highly scattering materials," Analytica Chimica Acta, vol. 853, pp. 486–494, 2015. doi: 10.1016/j.aca.2014.10.014
[32] P.-N. Tan, M. Steinbach, A. Karpatne, and V. Kumar, Introduction to Data Mining. Pearson Education India, 2006.
[33] N. J. Higham, Accuracy and Stability of Numerical Algorithms, 2nd ed. Philadelphia, PA, USA: SIAM, 2002.
[34] T. Naes, T. Isaksson, and B. Kowalski, "Locally weighted regression and scatter correction for near-infrared reflectance data," Analytical Chemistry, vol. 62, no. 7, pp. 664–673, 1990. doi: 10.1021/ac00206a003