DAT 2016 - Workshop on Data and Algorithmic Transparency (DAT'16)
Date: 2016-11-18
Deadline: 2016-09-15
Venue: New York, NY, USA - United States
Keywords:
Website: https://datworkshop.org
Topics/Call for Papers
The pervasiveness of data and algorithmic systems in society has generated a new class of research questions that the public is intensely interested in: Are my smart devices surreptitiously recording audio? Does my search history allow inferring intimate details that I haven’t explicitly searched for? Is the algorithm that decides my loan application biased? Do I see different prices online based on my browsing and purchase history? Are there dangerous instabilities or feedback loops in algorithmic systems ranging from finance to road traffic prediction?
Answering these questions requires empirical investigation of computer systems in the wild, with the goal of bringing transparency to these systems. Computer scientists are uniquely poised to carry out this research. The nascent literature on these topics makes clear that a combination of skills is called for: building systems to support large-scale, automated measurements; instrumenting devices to record and reverse-engineer network traffic; analyzing direct (leakage-based) and indirect (inference-based) privacy vulnerabilities; experimenting on black-box and white-box algorithmic systems; simulating and modeling these systems; machine learning; and crowdsourcing.
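As a concrete illustration of the black-box, "from the outside" measurement style the call describes, the sketch below probes for personalized pricing by requesting the same product page under two simulated browsing profiles and comparing the extracted prices. The target URL, cookie values, and price pattern are hypothetical placeholders, not part of any workshop artifact; a real study would use many profiles, repeated trials, and statistical controls for noise.

    # Minimal sketch of a black-box price-discrimination measurement.
    # The URL, cookies, and price regex below are hypothetical placeholders.
    import re
    import urllib.request

    PRODUCT_URL = "https://example.com/product/123"  # placeholder target page

    PROFILES = {
        "fresh_browser": {"User-Agent": "Mozilla/5.0", "Cookie": ""},
        "heavy_shopper": {"User-Agent": "Mozilla/5.0",
                          "Cookie": "session=abc; purchase_history=many"},  # hypothetical profile
    }

    PRICE_RE = re.compile(r"\$(\d+(?:\.\d{2})?)")  # assumes prices are shown as $NN.NN

    def fetch_price(url, headers):
        """Download the page under the given profile and extract the first price found."""
        req = urllib.request.Request(url, headers=headers)
        with urllib.request.urlopen(req, timeout=10) as resp:
            html = resp.read().decode("utf-8", errors="replace")
        match = PRICE_RE.search(html)
        return float(match.group(1)) if match else None

    if __name__ == "__main__":
        prices = {name: fetch_price(PRODUCT_URL, hdrs) for name, hdrs in PROFILES.items()}
        print(prices)
        if len({p for p in prices.values() if p is not None}) > 1:
            print("Different prices across profiles -- candidate for further auditing.")

The same pattern of controlled, repeated probes with varied inputs underlies most of the measurement skills listed above, from network-traffic instrumentation to crowdsourced audits.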
Computer science research today is largely siloed into disciplinary communities, none of which is well suited to tackle these interdisciplinary challenges. We call for the emergence of a new research community aimed at providing transparency and ethical oversight of digital technologies through empirical measurement and analysis. We envision this research feeding into a broader effort that would include law, policy, enforcement, the press, privacy advocates, and civil-liberties activists.
This new field is complementary to many existing disciplines. It draws techniques from measurement research, but investigates systems “from the outside” and is concerned with societal effects of systems rather than performance characteristics. It is also similar to security research, but the systems being studied do not have specifications of correct behavior. Finally, transparency research informs areas such as privacy-by-design and discrimination-aware data mining in creating systems that respect privacy and minimize bias. (Note that we are co-located with the FATML ’16 workshop.)
Investigating computer systems to identify effects of societal concern amounts to holding companies’ feet to the fire, and this will raise new ethical challenges. For example, is it a conflict of interest for a transparency researcher to accept industry funding? Regardless, we are confident that as the new community comes together to solve technical challenges, it will evolve an ethos and a set of norms to navigate these dilemmas as well.
Other CFPs
- 3rd Workshop on Fairness, Accountability, and Transparency in Machine Learning
- INTED2017 (11th annual Technology, Education and Development Conference)
- BIT’s 9th International Symposium of Cancer Immunotherapy
- International Conference on Civil Engineering (CiViE-2016)
- International Conference on Biomedical Engineering and Science (BES 2016)
Last modified: 2016-08-12 23:24:46