Machine Learning Optimization of Parameters for Noise Estimation
Yuyong Jeon (Inha University, Korea)
Ilkyeun Ra (University of Colorado Denver, USA)
Youngjin Park (Korea Electrotechnology Research Institute, Korea)
Sangmin Lee (Inha University, Korea)
Abstract: This paper proposes a fast and effective method of parameter optimization for noise estimation under various types of noise. The proposed method is based on gradient descent, an optimization technique widely used in machine learning. Because the goal is to maximize speech quality rather than minimize a loss, the learning rate of gradient descent is set to a negative value. Speech quality was evaluated using a suite of measures. After optimization by gradient descent, the parameter values were re-checked over a wider range to prevent convergence to a local optimum. For the problem's five parameters, the proposed method required 99.99958% fewer operations than the conventional method. The extracted optimal values improved speech quality by 1.1307%, 3.097%, 3.742%, and 3.861% on average at signal-to-noise ratios of 0, 5, 10, and 15 dB, respectively.
Keywords: gradient descent, machine learning, noise estimation, optimization
Categories: G.1.6, I.5.4
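The core idea in the abstract, running the standard gradient-descent update with a negative learning rate so that the update ascends a quality score instead of descending a loss, can be sketched as follows. This is a minimal illustration, not the paper's actual method: the toy quadratic "quality" surface, the finite-difference gradient, and all parameter values are assumptions standing in for the speech-quality measures and five noise-estimation parameters the paper optimizes.

```python
import numpy as np

def numerical_gradient(f, params, eps=1e-4):
    """Estimate the gradient of f at params by central finite differences
    (a stand-in; the paper does not specify how gradients are obtained)."""
    grad = np.zeros_like(params)
    for i in range(len(params)):
        step = np.zeros_like(params)
        step[i] = eps
        grad[i] = (f(params + step) - f(params - step)) / (2 * eps)
    return grad

def optimize(f, params, lr=-0.01, iters=500):
    """Standard gradient-descent update, params <- params - lr * grad.
    With lr < 0 each step moves *up* the gradient, so the quality score
    f is maximized rather than minimized."""
    params = params.astype(float).copy()
    for _ in range(iters):
        params = params - lr * numerical_gradient(f, params)
    return params

# Hypothetical quality surface with its maximum at (1, 2); in the paper
# this role is played by speech-quality measures evaluated on enhanced
# speech for a given set of noise-estimation parameters.
quality = lambda p: -((p[0] - 1.0) ** 2 + (p[1] - 2.0) ** 2)
opt = optimize(quality, np.array([0.0, 0.0]))
```

After running, `opt` converges near the maximizer `(1, 2)`. A wider-range re-check of the converged values, as the abstract describes, would amount to evaluating `quality` on a coarse grid around `opt` to guard against a merely local optimum.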