Show simple item record
Generating Random Variates via Kernel Density Estimation and Radial Basis Function Based Neural Networks
dc.contributor.author | Candia García, Cristian | |
dc.contributor.author | Forero, Manuel G. | |
dc.contributor.author | Herrera Rivera, Sergio | |
dc.date.accessioned | 2021-06-25T20:50:44Z | |
dc.date.accessioned | 2021-10-01T17:37:41Z | |
dc.date.available | 2021-06-25T20:50:44Z | |
dc.date.available | 2021-10-01T17:37:41Z | |
dc.date.issued | 2019 | |
dc.identifier.isbn | 978-3-030-13468-6 | |
dc.identifier.isbn | 978-3-030-13469-3 | |
dc.identifier.uri | https://repositorio.escuelaing.edu.co/handle/001/1605 | |
dc.description.abstract | When modeling phenomena that cannot be studied by deterministic analytical approaches, one of the main tasks is to generate random variates. The widely used techniques, such as the inverse transformation, convolution, and rejection-acceptance methods, involve a significant amount of statistical work and do not provide satisfactory results when the data do not conform to the known probability density functions. This study aims to propose an alternative nonparametric method for generating random variables that combines kernel density estimation (KDE) and radial basis function based neural networks (RBFBNNs). We evaluated the method’s performance using Poisson, triangular, and exponential probability density distributions and assessed its utility for unknown distributions. The results show that the model’s effectiveness depends substantially on selecting an appropriate bandwidth value for KDE and a certain minimum number of data points to train the algorithm. The proposed method enabled us to achieve an R2 value between 0.91 and 0.99 for the analyzed distributions. | eng |
dc.description.abstract | Cuando se modelan fenómenos que no se pueden estudiar mediante enfoques analíticos deterministas, una de las principales tareas es generar variables aleatorias. Las técnicas ampliamente utilizadas, como los métodos de transformación inversa, convolución y rechazo-aceptación, involucran una cantidad significativa de trabajo estadístico y no brindan resultados satisfactorios cuando los datos no se ajustan a las funciones de densidad de probabilidad conocidas. Este estudio tiene como objetivo proponer un método no paramétrico alternativo para generar variables aleatorias que combine la estimación de la densidad del kernel (KDE) y las redes neuronales basadas en funciones de base radial (RBFBNN). Evaluamos el rendimiento del método usando distribuciones de densidad de probabilidad Poisson, triangular y exponencial y evaluamos su utilidad para distribuciones desconocidas. Los resultados muestran que la efectividad del modelo depende sustancialmente de seleccionar un valor de ancho de banda apropiado para KDE y un cierto número mínimo de puntos de datos para entrenar el algoritmo. El método propuesto nos permitió alcanzar un valor de R2 entre 0,91 y 0,99 para las distribuciones analizadas. | spa |
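The abstract describes a two-stage pipeline: estimate an unknown density with KDE, then learn its inverse CDF so uniform variates can be mapped to new samples. A minimal sketch of that idea in Python follows; the RBF-network stage of the paper is approximated here by plain numerical interpolation of the inverted CDF, and the exponential test data, bandwidth rule, and grid size are illustrative assumptions, not the authors' exact setup.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)
# Observed sample from an "unknown" distribution (here: exponential, mean 2.0)
data = rng.exponential(scale=2.0, size=500)

# Stage 1: nonparametric density estimate (Scott's-rule bandwidth by default)
kde = gaussian_kde(data)
grid = np.linspace(data.min(), data.max(), 1000)
pdf = kde(grid)

# Build a CDF by cumulative summation and normalize it to [0, 1]
cdf = np.cumsum(pdf)
cdf /= cdf[-1]

# Stage 2 (stand-in for the RBF network): invert the CDF by interpolation
# and push uniform variates through it (inverse-transform sampling)
u = rng.uniform(size=1000)
variates = np.interp(u, cdf, grid)

print(round(float(variates.mean()), 2))  # should be near the true mean 2.0
```

As the abstract notes, the quality of the generated variates hinges on the KDE bandwidth and the sample size; with too few points or a poorly chosen bandwidth, the estimated CDF, and hence the variates, will be biased.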
dc.format.extent | 8 páginas | spa |
dc.format.mimetype | application/pdf | spa |
dc.language.iso | eng | spa |
dc.publisher | Springer Nature | spa |
dc.relation.ispartofseries | LNCS;11401 | |
dc.source | https://link.springer.com/chapter/10.1007%2F978-3-030-13469-3_29 | spa |
dc.title | Generating Random Variates via Kernel Density Estimation and Radial Basis Function Based Neural Networks | eng |
dc.type | Capítulo - Parte de Libro | spa |
dc.type.version | info:eu-repo/semantics/publishedVersion | spa |
oaire.accessrights | http://purl.org/coar/access_right/c_16ec | spa |
oaire.version | http://purl.org/coar/version/c_970fb48d4fbd8a85 | spa |
dc.identifier.doi | 10.1007/978-3-030-13469-3_29 | |
dc.identifier.url | https://doi.org/10.1007/978-3-030-13469-3_29 | |
dc.publisher.place | Suiza | spa |
dc.relation.citationedition | Iberoamerican Congress on Pattern Recognition CIARP 2018: Progress in Pattern Recognition, Image Analysis, Computer Vision, and Applications pp 245-252 | spa |
dc.relation.citationendpage | 252 | spa |
dc.relation.citationstartpage | 245 | spa |
dc.relation.indexed | N/A | spa |
dc.relation.ispartofbook | Progress in Pattern Recognition, Image Analysis, Computer Vision, and Applications | spa |
dc.relation.references | Banks, J. (ed.): Handbook of Simulation: Principles, Methodology, Advances, Applications, and Practice, 1 edn. Wiley-Interscience, New York/Norcross (1998) | spa |
dc.relation.references | Krishnamoorthy, K.: Handbook of Statistical Distributions with Applications, 2nd edn. CRC Press, Boca Raton (2016) | spa |
dc.relation.references | Svensson, C., Hannaford, J., Prosdocimi, I.: Statistical distributions for monthly aggregations of precipitation and streamflow in drought indicator applications. Water Resour. Res. 53(2), 999–1018 (2017) | spa |
dc.relation.references | Gerber, M.S.: Predicting crime using Twitter and kernel density estimation. Decis. Support Syst. 61(Suppl. C), 115–125 (2014) | spa |
dc.relation.references | Zipkin, E.F., Leirness, J.B., Kinlan, B.P., O’Connell, A.F., Silverman, E.D.: Fitting statistical distributions to sea duck count data: implications for survey design and abundance estimation. Stat. Methodol. 17(Suppl. C), 67–81 (2014) | spa |
dc.relation.references | Berkson, J.: Some difficulties of interpretation encountered in the application of the chi-square test. J. Am. Stat. Assoc. 33(203), 526–536 (1938) | spa |
dc.relation.references | Massey, F.J.: The Kolmogorov-Smirnov test for goodness of fit. J. Am. Stat. Assoc. 46(253), 68–78 (1951) | spa |
dc.relation.references | Gutiérrez, M., Agustín, P., Gómez-Restrepo, C.: Beyond p value. Rev. Colomb. Psiquiatr. 38(3), 574–586 (2009) | spa |
dc.relation.references | Rosenblatt, M.: Remarks on some nonparametric estimates of a density function. Ann. Math. Stat. 27(3), 832–837 (1956) | spa |
dc.relation.references | Silverman, B.W.: Algorithm AS 176: kernel density estimation using the fast Fourier transform. J. R. Stat. Soc. Ser. C Appl. Stat. 31(1), 93–99 (1982) | spa |
dc.relation.references | Heidenreich, N.-B., Schindler, A., Sperlich, S.: Bandwidth selection for kernel density estimation: a review of fully automatic selectors. AStA Adv. Stat. Anal. 97(4), 403–433 (2013) | spa |
dc.relation.references | Agarwal, R., Chen, Z., Sarma, S.V.: A novel nonparametric maximum likelihood estimator for probability density functions. IEEE Trans. Pattern Anal. Mach. Intell. 39(7), 1294–1308 (2017) | spa |
dc.relation.references | Padilla, O.H.M., Scott, J.G.: Nonparametric density estimation by histogram trend filtering. arXiv:150904348 Stat, September 2015 | spa |
dc.relation.references | Xu, X., Yan, Z., Xu, S.: Estimating wind speed probability distribution by diffusion-based kernel density method. Electr. Power Syst. Res. 121, 28–37 (2015) | spa |
dc.relation.references | Arora, S., Taylor, J.W.: Forecasting electricity smart meter data using conditional kernel density estimation. Omega 59(Part A), 47–59 (2016) | spa |
dc.relation.references | Barabesi, L., Pratelli, L.: Universal methods for generating random variables with a given characteristic function. J. Stat. Comput. Simul. 85(8), 1679–1691 (2015) | spa |
dc.relation.references | Magdon-Ismail, M., Atiya, A.: Density estimation and random variate generation using multilayer networks. IEEE Trans. Neural Netw. 13(3), 497–520 (2002) | spa |
dc.relation.references | Alzaatreh, A., Lee, C., Famoye, F.: A new method for generating families of continuous distributions. METRON 71(1), 63–79 (2013) | spa |
dc.relation.references | Bringmann, K., Friedrich, T.: Exact and efficient generation of geometric random variates and random graphs. In: Fomin, F.V., Freivalds, R., Kwiatkowska, M., Peleg, D. (eds.) ICALP 2013. LNCS, vol. 7965, pp. 267–278. Springer, Heidelberg (2013). https://doi.org/10.1007/978-3-642-39206-1_23 | spa |
dc.relation.references | Specht, D.F.: Probabilistic neural networks. Neural Netw. 3(1), 109–118 (1990) | spa |
dc.relation.references | Specht, D.F.: A general regression neural network. IEEE Trans. Neural Netw. 2(6), 568–576 (1991) | spa |
dc.relation.references | Pedregosa, F., et al.: Scikit-learn: machine learning in Python. J. Mach. Learn. Res. 12, 2825–2830 (2011) | spa |
dc.rights.accessrights | info:eu-repo/semantics/closedAccess | spa |
dc.subject.armarc | Variable aleatoria | spa |
dc.subject.armarc | Distribución de probabilidad | spa |
dc.subject.armarc | Teoría de las distribuciones (Análisis funcional) | spa |
dc.subject.armarc | Theory of distributions (Functional analysis) | eng |
dc.subject.armarc | Funciones Kernel | spa |
dc.subject.armarc | Kernel functions | eng |
dc.subject.proposal | General regression neural network | eng |
dc.subject.proposal | Probabilistic neural network | eng |
dc.subject.proposal | Kernel density estimation | eng |
dc.subject.proposal | Random variable | eng |
dc.subject.proposal | Probability distribution | eng |
dc.type.coar | http://purl.org/coar/resource_type/c_2df8fbb1 | spa |
dc.type.content | Text | spa |
dc.type.driver | info:eu-repo/semantics/bookPart | spa |
dc.type.redcol | http://purl.org/redcol/resource_type/ART | spa |
Files in this item
This item appears in the following collection(s)
AI - Centro de Investigación en Manufactura y Servicios – CIMSER [49]
Classification: B - Convocatoria 2018.