Cutting NSF Is Like Liquidating Your Finest Investment
Look closely at your mobile phone or tablet. Touch-screen technology, speech recognition, digital sound recording and the internet were all developed using […]
Do researchers want to be engaged? Many have suggested they do not. By and large, I found the opposite: the large majority of researchers accepted my invitation to connect with practitioners.
Joni Lakin takes a look at David Lohman’s seminal 2005 work in Gifted Child Quarterly. His paper addresses the issue of underrepresentation while tackling a well-intentioned myth that nonverbal tests are the most equitable way to assess students who come from racial, ethnic, or linguistic minorities in the U.S.
Autistic individuals are estimated to be seven times more likely than the general population to come into contact with the Criminal Justice System. Dr Chloe Holloway from the University of Nottingham is one of the finalists for Outstanding Early Career Impact in the ESRC Celebrating Impact Prize 2019.
Adam Seth Levine compares how many practitioners engaged in self-matchmaking by contacting researchers directly through the site versus the number who requested hands-on matchmaking.
As part of a project sponsored by the National Academy of Sciences and the Rita Allen Foundation, four science communications experts tackled questions surrounding the effective and ethical communication of science to relevant policymakers. In this webinar, we talk to the four experts about their findings and the processes they recommend.
Robert J. Marks, the director of the Walter Bradley Center for Natural and Artificial Intelligence, argues that academic reformers are battling numerical laws that govern how incentives work. His counsel? Know your enemy!
Traditionally, one of the biggest obstacles to building relationships between researchers and practitioners has been their different time scales — nonprofits’ “focus is urgent, immediate, and often in response to events…moving quickly and loudly” whereas “academics work to a different rhythm”.
Promoting public engagement with research has become a core mission for research funders. However, the extent to which researchers can assess the impact of this engagement is often under-analysed and limited to success stories. Drawing on the example of development aid, Marco J Haenssgen argues we need to widen the parameters for assessing public engagement and begin to develop a skills base for the external evaluation of public engagement work.