Women, Data and Power. Insights into the Platform Economy (original title: Mujeres, datos y poder. Una mirada al interior de la economía de las plataformas)
Author: Gutiérrez Almazor, Miren (coord.), Universidad de Deusto
ISSN: 1696-8166, 1989-9998
Year of publication: 2023
Issue title: Women, data and power. Insights into the platform economy
Issue: 42
Pages: 13-25
Type: Article
Published in: Feminismo/s
References
- Andreeva, G., & Matuszyk, A. (2018). Gender discrimination in algorithmic decision-making. 2nd International Conference on Advanced Research Methods and Analytics (CARMA 2018). https://doi.org/10.4995/CARMA2018.2018.8312
- Cadwalladr, C., & Graham-Harrison, E. (2018). Revealed: 50 million Facebook profiles harvested for Cambridge Analytica in major data breach. The Guardian. https://www.theguardian.com/news/2018/mar/17/cambridge-analytica-facebook-influence-us-election
- Crawford, K. (2013). The Hidden Biases in Big Data. Harvard Business Review. https://hbr.org/2013/04/the-hidden-biases-in-big-data
- Data2X. (2021). Important data about women and girls is incomplete or missing. https://data2x.org/
- D'Ignazio, C., & Klein, L. F. (2019). Data feminism. MIT Press. https://doi.org/10.7551/mitpress/11805.001.0001
- D'Ignazio, C., & Klein, L. F. (2020). Seven intersectional feminist principles for equitable and actionable COVID-19 data. Big Data & Society, 7(2), 1-6. https://doi.org/10.1177/2053951720942544
- Eisenstat, Y. (2019). The Real Reason Tech Struggles with Algorithmic Bias. Wired. https://www.wired.com/story/the-real-reason-tech-struggles-with-algorithmic-bias/
- Flexer, A., Doerfler, M., Schluter, J., & Grill, T. (2018). Technical Algorithmic Bias in a Music Recommender. 19th International Society for Music Information Retrieval Conference (ISMIR). https://doi.org/10.1109/ICDMW.2018.00154
- Forensic Architecture. (2020). The Killing of Zineb Redouane. https://forensic-architecture.org/investigation/the-killing-of-zineb-redouane
- Gray, J., Bounegru, L., Milan, S., & Ciuccarelli, P. (2016). Ways of seeing data: Towards a critical literacy for data visualizations as research objects and research devices. In S. Kubitschko & A. Kaun (Eds.), Innovative Methods in Media and Communication Research (pp. 290-325). Palgrave Macmillan. https://doi.org/10.1007/978-3-319-40700-5_12
- Gutiérrez, M. (2018). Data activism and social change. Palgrave Macmillan. https://doi.org/10.1007/978-3-319-78319-2
- Gutiérrez, M. (2019). Participation in a datafied environment: Questions about data literacy. Comunicação e Sociedade, 36, 29-47. https://doi.org/10.17231/comsoc.36(2019).2342
- Gutiérrez, M. (2021). Algorithmic Gender Bias and Audiovisual Data: A Research Agenda. International Journal of Communication, 15, 439-461. https://ijoc.org/index.php/ijoc/article/viewFile/14906/3333
- Hajian, S., Bonchi, F., & Castillo, C. (2016). Algorithmic Bias: From Discrimination Discovery to Fairness-aware Data Mining. Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. https://doi.org/10.1145/2939672.2945386
- Hao, K. (2019). This is how AI bias really happens, and why it's so hard to fix. MIT Technology Review. https://www.technologyreview.com/s/612876/this-is-how-ai-bias-really-happensand-why-its-so-hard-to-fix/
- Helmond, A. (2015). The Platformization of the Web: Making Web Data Platform Ready. Social Media + Society, 1(2), 1-11. https://doi.org/10.1177/2056305115603080
- Knight, W. (2016). How to Fix Silicon Valley's Sexist Algorithms: Computers are inheriting gender bias implanted in language data sets, and not everyone thinks we should correct it. MIT Technology Review. https://www.technologyreview.com/s/602950/how-to-fix-silicon-valleys-sexist-algorithms/
- Kõuts-Klemm, R. (2019). Data literacy among journalists: A skills-assessment based approach. Central European Journal of Communication, 3, 299-315. https://doi.org/10.19195/1899-5101.12.3(24).2
- Langston, J. (2015). Who's a CEO? Google image results can shift gender biases. University of Washington. https://www.washington.edu/news/2015/04/09/whos-a-ceo-google-image-results-can-shift-gender-biases/
- Pegg, D., & Cadwalladr, C. (2018). US data firm admits employee approached Cambridge Analytica. The Guardian. https://www.theguardian.com/uk-news/2018/mar/28/palantir-employee-cambridge-analytica
- Ramsey, L. R., & Horan, A. L. (2018). Picture this: Women's self-sexualization in photos on social media. Personality and Individual Differences, 133(15), 85-90. https://doi.org/10.1016/j.paid.2017.06.022
- Reuters. (2018). Myanmar: UN blames Facebook for spreading hatred of Rohingya. The Guardian. https://www.theguardian.com/technology/2018/mar/13/myanmar-un-blames-facebook-for-spreading-hatred-of-rohingya
- Rodríguez Martínez, M., & Gaubert, J. (2020). International Women's Day: How can algorithms be sexist? Euronews. https://www.euronews.com/2020/03/08/international-women-s-day-our-algorithms-are-sexist
- Taylor, L. (2018). As technology advances, women are left behind in digital divide. Thomson Reuters Foundation. https://www.reuters.com/article/us-britain-women-digital/as-technology-advances-women-are-left-behind-in-digital-divide-idUSKBN1K02NT
- Tolan, S. (2019). Fair and Unbiased Algorithmic Decision Making: Current State and Future Challenges (Digital Economy Working Paper) [Background paper to the European Commission's report: 'Artificial Intelligence: A European Perspective']. European Commission - Joint Research Centre. https://arxiv.org/abs/1901.04730
- Vaitla, B., Bosco, C., Alegana, V., & Wouter, E. (2017). Big Data and the Well-Being of Women and Girls: Applications on the Social Scientific Frontier. Data2X. https://www.data2x.org/wp-content/uploads/2019/05/Big-Data-and-the-Well-Being-of-Women-and-Girls_.pdf
- van Dijck, J. (2014). Datafication, dataism and dataveillance: Big Data between scientific paradigm and ideology. Surveillance & Society, 12(2), 197-208. https://doi.org/10.24908/ss.v12i2.4776
- Wachter-Boettcher, S. (2017). Technically Wrong: Sexist Apps, Biased Algorithms, and Other Threats of Toxic Tech. W. W. Norton & Company.
- Wang, E. (2018). Two dangerous visions: What does it really mean for an algorithm to be biased? The Gradient. https://thegradient.pub/ai-bias/
- Weizman, E. (2017). Forensic Architecture: Violence at The Threshold of Detectability. Zone Books. https://doi.org/10.2307/j.ctv14gphth
- Zhao, J., Wang, T., Yatskar, M., Ordonez, V., & Chang, K.-W. (2017). Men Also Like Shopping: Reducing Gender Bias Amplification using Corpus-level Constraints. In M. Palmer, R. Hwa & S. Riedel (Eds.), Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing (pp. 2979-2989). Association for Computational Linguistics. https://doi.org/10.18653/v1/D17-1323