<h2><a href="/news/power-and-prediction-u-t-s-avi-goldfarb-disruptive-economics-artificial-intelligence">Power and prediction: 鶹Ƶ's Avi Goldfarb on the disruptive economics of artificial intelligence</a></h2>
<p><em>By <a href="/news/authors-reporters/daniel-browne">Daniel Browne</a> | January 20, 2023</em></p>
<img src="/sites/default/files/styles/news_banner_740/public/goldfarb-power-and-prediction.jpg?h=afdc3185&amp;itok=qG9u_ezS" width="740" height="494" alt="Headshot of Avi Goldfarb and book cover">
<p><em>Avi Goldfarb, a professor at the Rotman School of Management and research lead at the Schwartz Reisman Institute for Technology and Society, says the AI revolution is well underway – but that system-level change takes time (supplied images)</em></p>
field__item"><p>In the new book <a href="https://store.hbr.org/product/power-and-prediction-the-disruptive-economics-of-artificial-intelligence/10580?sku=10580E-KND-ENG"><em>Power and Prediction: The Disruptive Economics of Artificial Intelligence</em></a>, co-author&nbsp;<strong><a href="https://www.avigoldfarb.com/">Avi Goldfarb</a></strong>&nbsp;argues we live in the “Between Times”: after discovering the potential of AI, but before its widespread adoption.</p> <p>Delays in implementation are an essential part of any technology with the power to truly reshape society, says Goldfarb, a professor of marketing and the Rotman Chair in Artificial Intelligence and Healthcare at the University of Toronto's Rotman School of Management and research lead at the <a href="/news/tags/schwartz-reisman-institute-technology-and-society">Schwartz Reisman Institute for Technology and Society</a>.</p> <p>He makes the case for how AI innovation will evolve in <em>Power and Prediction</em>, his latest&nbsp;book co-authored with fellow Rotman professors <strong>Ajay Agrawal</strong> and <strong>Joshua Gans</strong>. The trio, who also wrote 2018’s <a href="https://store.hbr.org/product/prediction-machines-updated-and-expanded-the-simple-economics-of-artificial-intelligence/10598"><em>Prediction Machines: The Simple Economics of Artificial Intelligence</em></a>, are the&nbsp;co-founders of the&nbsp;<a href="https://creativedestructionlab.com/">Creative Destruction Lab</a>, a non-profit organization that helps science- and technology-based startups scale.</p> <p>Goldfarb will give a talk at the Rotman School of Management <a href="https://srinstitute.utoronto.ca/events-archive/seminar-2023-avi-goldfarb">on Jan. 25</a> as part of the SRI Seminar Series. He&nbsp;spoke with the Schwartz Reisman Institute to discuss how the evolution of AI innovation will require systems-level changes to the ways that organizations make decisions.</p> <p><em>(The&nbsp;interview has been condensed for length and clarity.)</em></p> <hr> <p><strong>What changed in your understanding of the landscape of AI innovation since your last book?</strong></p> <p>We wrote <em>Prediction Machines</em> thinking that a revolution was about to happen, and we saw that revolution happening at a handful of companies like Google, Amazon and others. But when it came to most businesses we interacted with, by 2021 we started to feel a sense of disappointment. Yes, there was all this potential, but it hadn’t affected their bottom line yet – the uses that they’d found had been incremental, rather than transformational. And that got us trying to understand what went wrong.</p> <p>One potential thing that could have gone wrong, of course, was that AI wasn’t as exciting as we thought. Another was that the technology was potentially as big a deal as the major revolutions of the past 200 years – innovations like steam, electricity, computing – and the issue was system-level implementation. For every major technological innovation, it took a long time to figure out how to make that change affect society at scale.</p> <p>The core idea of <em>Power and Prediction</em> is that AI is an exciting technology – but it’s going to take time to see its effects, because a lot of complementary innovation has to happen as well. Now, some might respond that’s not very helpful, because we don’t want to wait. And part of our agenda in the book is to accelerate the timeline of this innovation from 40 years to 10, or even less. 
To get there, we then need to think through what this innovation is going to look like. We can’t just say it’s going to take time – that’s not constructive.</p> <p><strong>What sort of changes are needed for organizations to harness AI’s full potential?</strong></p> <p>Here, we lean on three key ideas. The first idea is that AI today is not artificial general intelligence (AGI) – it’s prediction technology. The second is that a prediction is useful because it helps you make decisions. A prediction without a decision is useless. So, what AI really does is allow you to unbundle the prediction from the rest of the decision, and that can lead to all sorts of transformation. Finally, the third key idea is that decisions don’t happen in isolation.</p> <p>What prediction machines do is allow you to change who makes decisions and when those decisions are made. There are all sorts of examples of what seems like an automated decision, but what it actually does is take some human’s decision – typically at headquarters – and scales it. For organizations to succeed, they require a whole bunch of people working in concert. It’s not about one decision – it’s about decisions working together.</p> <p>One example is health care – at the emergency department, there is somebody on triage, who gives a prediction about the severity of what’s going on. They might send a patient immediately for tests or ask them to wait. Right now, AIs are used in triage at SickKids in Toronto and other hospitals, and they are making it more effective. But&nbsp;to really take advantage of the prediction, they need to coordinate with the next step. If triage is sending people for a particular test more frequently, then there need to be other decisions made about staffing for those tests, and where to offer them. And, if your predictions are good enough, there’s an even different decision to be made – maybe you don’t even need the tests. If your prediction that somebody’s having a heart attack is good enough, you don’t need to send them for that extra test and waste that time or money. Instead, you’ll send them direct to treatment, and that requires coordination between what’s happening upstream on the triage side and what’s happening downstream in terms of the testing or treatment side.</p> <p><img alt="avi goldfarb teaches a class " src="/sites/default/files/csm_news_28032019_01_2bb9b0f93f.jpg" style="width: 750px; height: 500px;"></p> <p><em>AI&nbsp;is as exciting a technology as electricity and computing, but it will take time to see its effects, Avi Goldfarb says.</em></p> <p><strong>Will certain sectors have greater ease in adopting system-level changes than others?</strong></p> <p>There is a real opportunity here for startups&nbsp;because when building a new system from scratch, it’s often easier to start with nothing. You don’t have to convince people to come along with your changes, so it becomes a less political process – at least within your organization. If you’re trying to change a huge established company or organization, it’s going to be harder.</p> <p>I’m very excited about the potential for AI and health care, but health care is complicated; there are so many different decision-makers. 
There are the patients, the payers – sometimes government, sometimes insurance companies, sometimes a combination of the above – and then there are doctors, who have certain interests, medical administrators who might have different interests, and nurses.</p>
<p>AI has potential to supercharge nurses, because a key distinction between a doctor and a nurse in terms of training is diagnosis, which is a prediction problem. If AI is helping with diagnosis, that has potential to make nurses more central to how we structure the system. But that’s going to require all sorts of changes, and we have to get used to that as patients. And so, while I think the 30-year vision for what health care could look like is extraordinary, the five-year timeline is really, really hard.</p>
<p><strong>What are some of the other important barriers to AI adoption?</strong></p>
<p>A lot of the challenges to AI adoption come from ambiguity about what’s allowed or not in terms of regulation. In health care contexts, we are seeing lots of people trying to identify incremental point solutions that don’t require regulatory approval. We may have an AI that can replace a human in some medical process, but to do it is going to be a 10-year, multibillion-dollar process to get approval – so they’ll implement it in an app that people can use at home with a warning that it’s not real medical advice.</p>
<p>The regulatory resistance to change, and the ambiguity of what’s allowed, is a real barrier. As we start thinking about system changes, there is an important role for government through legislation and regulation, as well as through its coordinating function as the country’s biggest buyer of stuff, to help push us toward new AI-based systems.</p>
<p>There are also real concerns about data and bias, especially in the short term. However, in the long run, I’m very optimistic about AI to help with discrimination and bias. While a lot of the resistance to AI implementation right now is coming from people who are worried about [people who will be negatively impacted by] bias [in the data], I think that pretty soon this will flip around.</p>
<p>There’s a story we discuss in the book, where Major League Baseball brought in a machine that could say whether a pitch was a strike or a ball, and the people who resisted it turned out to be the superstars. Why? Well, the best hitters tended to get favoured by umpires and face smaller strike zones, and the best pitchers also tended to get favoured and had bigger strike zones. The superstars benefited from this human bias, and when they brought in a fairer system, the superstars got hurt. So, we should expect that people who currently benefit from bias are going to resist machine systems that can overcome it.</p>
<p><strong>What do you look for to indicate where disruptions from AI innovation will occur?</strong></p>
<p>We’re seeing this change already in a handful of industries tech is paying attention to, such as advertising. Advertising had a very <em>Mad Men</em> vibe until recently: there was a lot of seeming magic in terms of whether an ad worked, how to hire an agency and how the industry operated – a lot of charm and fancy dinners. That hasn’t completely gone away, but advertising is largely an algorithm-based industry now. The most powerful players are big tech companies – they’re no longer the historical publishers who worked on Madison Avenue. We’ve seen the disruption – it’s happened.</p>
<p>Think through the mission of any industry or company.
Once you understand the mission, think through all the ways that mission is compromised because of bad prediction. Once you see where the mission doesn’t align with the ways in which an organization is actually operating, those are going to be the cases where either the organization is going to need to disrupt themselves, or someone’s going to come along and do what they do better.</p>
<h3><a href="https://srinstitute.utoronto.ca/news/power-and-prediction-avi-goldfarb-on-the-disruptive-economics-of-ai">Read the full Q&amp;A at the Schwartz Reisman Institute for Technology and Society</a></h3>

<h2><a href="/news/popularity-gamified-apps-raises-new-legal-issues-student-researchers-warn">Popularity of 'gamified' apps raises new legal issues, student researchers warn</a></h2>
<p><em>By <a href="/news/authors-reporters/nina-haikara">Nina Haikara</a> | April 5, 2022</em></p>
<img src="/sites/default/files/styles/news_banner_740/public/GettyImages-1328853719-crop.jpg?h=afdc3185&amp;itok=ZKP1M00Y" width="740" height="494" alt="a woman uses a dating app and is deciding which way to swipe on a picture of an east asian man">
<p><em>(photo by AsiaVision/Getty Images)</em></p>
<p>With a few pushes of a button or a swipe of your fingers, you can trade stocks, hail a ride, order a pizza or find a date. But what are the implications of these gamified apps for human behaviour and the law?</p>
<p>Law students in the University of Toronto’s <a href="https://futureoflaw.utoronto.ca/">Future of Law Lab</a> explore these questions in a new research report that focuses on how current laws should respond to gamification – the introduction of elements of play and gaming across activities and aspects of life.</p>
<p>“My research interest is in securities law, so gamification really captured my attention with the whole meme stock craze, and speculation about how online trading apps might have fed into it,” says Doug Sarro, a doctoral candidate at the Faculty of Law.</p>
<p>“By gamifying investing, did online trading apps lead their users to trade too frequently in assets that were too risky for them? I looked around and saw gamification raises issues in other areas of law, too. For instance, when ridesharing apps use gamification to influence when and where drivers work, does this mean these drivers ought to be considered employees rather than independent contractors?</p>
<p>“I thought the Future of Law Lab would be a fantastic place to explore gamification and gain a broader view of the challenges it poses to law.”</p>
<p>The lab established a co-curricular working group led by Sarro and 16 JD students to research the implications of gamification in four areas: online trading; ridesharing and food delivery; employee productivity; and dating.</p>
<p>First-year law student Nikée Allen, who has a bachelor’s degree in psychology with a minor in sociology from Ryerson University, looked at how dating apps can internalize and propagate racial biases among users.</p>
<p>“There are many – especially young – users, including vulnerable members of marginalized communities, who are being ranked by the people who are swiping on them,” Allen says. “If racialized users are being swiped on less, they are ranked lower, and they’re being seen by fewer people.”</p>
<p><img src="/sites/default/files/Gamification-Report-Team-crop.jpg" alt="The Future of Law Lab team that worked on the gamification report"><br> <em>The Future of Law Lab team that worked on the gamification report (photo by Nina Haikara)</em></p>
<p>To better understand how the apps worked, Allen downloaded a dating app, made a profile and refused potential matches.</p>
<p>“I kept track of the kind of push notifications I was receiving: ‘You would have more matches if you did this on the app.’ ‘You should buy a boost; more people will be able to see you.’”</p>
<p>Allen says people who don’t know how the algorithm works are being induced to engage in a particular behaviour, which can have unexpected legal consequences.</p>
<p>“The coercive element comes in when users are being told to use the app more. If no one is seeing them, then they must pay to increase their visibility and it’s only temporary,” Allen says.
“So, you keep paying to get equal access to the same service that others can access for free.”</p> <p>The potential for legal injury will only grow as dating apps become more popular, Allen says.&nbsp;</p> <p>Fellow first-year law student <strong>Samir Reynolds</strong>, for his part, studied the design techniques of ride-sharing apps.&nbsp;</p> <p>“There’s been a lot of press coverage about whether drivers are employees, independent contractors or another classification,” says Reynolds, who has a bachelor's degree in knowledge integration, math and political science from the University of Waterloo. “However, these apps can nudge drivers to work at specific places over specific times. If we think of them as effectively telling their drivers to do that, then the drivers start to look more like employees rather than independent contractors.”</p> <p>Reynolds adds that, although consumers see prices surge when drivers are in high demand, that doesn't mean the drivers are paid more during peak hours.</p> <p>“These apps also tend to set artificial goals. Someone will be driving and after a ride is done, they will receive a notification they’re only $6 away from making $80 on their shift. But then after you make that, it will switch to only $7 away from $90.”&nbsp;</p> <p>Reynolds came to 鶹Ƶ Law after working in the technology sector for two years.&nbsp;</p> <p>“One of the defining problems of this generation of lawyers is going to be figuring out how law and policy, both from the court and from policy-makers, can evolve and adapt with technology, which is inherently going to advance faster than the law,” he says.</p> <p>The Future of Law Lab was established in 2020 – thanks to a gift by <strong>Hal Jackman</strong>, a former 鶹Ƶ chancellor and lieutenant-governor of Ontario – as a co-curricular program to bring together students, academics, lawyers and other professionals to explore the intersection of law, innovation and technology.</p> <p>"The Future of Law Lab provides students with an opportunity to learn about legal problems from a holistic perspective. Law does not exist in a vacuum and, in our context, legal problems are often business problems,” says lab director <strong>Joshua Morrison</strong>, a lawyer and graduate of the faculty’s global professional master of laws program with a concentration in innovation, law and technology.&nbsp;</p> <p>“We ask our students to consider the strategic, operational, marketing, privacy and technology aspects of a particular situation. We’re encouraging them to be innovative and solutions-oriented, while encouraging collaboration among professionals of different disciplines.”&nbsp;</p> <p>The researchers say a digital choice environment that leads people to situations where there are risks of harm should be treated the same way as a business that creates a physical choice environment that leads to the same consequences.&nbsp;</p> <p>“Law is always trying to catch up to innovation,” Sarro says. “One of the ways it keeps up is by staying flexible. And we can leverage that flexibility to respond to many of the challenges posed by gamification.”&nbsp;</p> <p>“If you're using gamification to influence people to work where you want them to work, and for how long, law can say that’s a form of control, and so we factor it in when assessing whether these workers ought to be deemed employees. 
If you’re a business that profits from guiding people to engage in behaviour that creates risks of harm to themselves or others, law can impose a duty to mitigate those risks,” Sarro says.</p>
<p>“Anti-discrimination law has a flexible set of principles that can be used to encourage apps to reflect on the effects design choices have on different groups of people and whether those design choices are helping to mitigate risks of discrimination or are amplifying those risks.”</p>
<p>Allen sees the research project with the Future of Law Lab as good training for a career in technology law following graduation.</p>
<p>“The Future of Law Lab is the best place to set myself up for success,” she says.</p>

<h2><a href="/news/ai-tech-and-social-justice-u-t-groundbreakers-ep4">AI, tech and social justice: 鶹Ƶ Groundbreakers EP4</a></h2>
<p><em>January 13, 2022</em></p>
<figure class="youtube-container"> <iframe src="https://www.youtube.com/embed/zRV1Kt7tnno" width="450" height="315" title="AI, tech and social justice: 鶹Ƶ Groundbreakers EP4" allowfullscreen></iframe> </figure>
<p>What is the relationship between pollution and colonialism in Canada? How can AI and related technologies avoid perpetuating racism and gender bias?</p>
<p>These are some of the questions explored in episode four of the <i>Groundbreakers</i> video series when host <b>Ainka Jess</b> speaks with researchers from two of the University of Toronto’s Institutional Strategic Initiatives: <a href="https://srinstitute.utoronto.ca/">the Schwartz Reisman Institute for Technology and Society</a> and <a href="https://brn.utoronto.ca/">the Black Research Network</a>.</p>
<p>鶹Ƶ Mississauga’s <b>Kristen Bos</b>, an assistant professor of Indigenous science and technology and the co-director of the Technoscience Research Unit, talks about The Land and Refinery project, while 鶹Ƶ Engineering alumna <b>Deborah Raji</b> discusses how bias in AI algorithms can perpetuate racism and gender bias and erode civil rights – and how access to technology can further inclusive excellence.</p>
<p>“As an Indigenous feminist and researcher, I know that the health of our lands is vital to the health of our bodies,” Bos says. “I feel like I have a responsibility. I think we all have a responsibility to hold the industries, the companies and the governments responsible for the creation of pollution and health harms.”</p>
<p><i>Groundbreakers</i> is a multimedia series that <a href="/news/tags/groundbreakers">includes articles at <i>鶹Ƶ News</i></a> and features research leaders involved with 鶹Ƶ’s <a href="https://isi.utoronto.ca/">Institutional Strategic Initiatives</a>, whose work will transform lives.</p>

<h2><a href="/news/u-t-sociologist-studies-women-s-careers-exciting-and-terrifying-tech-sector">鶹Ƶ sociologist studies women's careers in 'exciting and terrifying' tech sector</a></h2>
<p><em>By <a href="/news/authors-reporters/jovana-jankovic">Jovana Jankovic</a> | December 9, 2019</em></p>
<img src="/sites/default/files/styles/news_banner_740/public/Sharla%20Alegria_0172.jpg?h=afdc3185&amp;itok=JE_It0jj" width="740" height="494" alt="Portrait of Sharla Alegria">
<p><em>Sharla Alegria, an assistant professor in the department of sociology who studies racial and gender inequality, has taken a particular interest in the career trajectories of women in the technology sector (photo by Diana Tyszko)</em></p>
<p><strong>Sharla Alegria</strong> is working on work.</p>
<p>“I care an awful lot about work in general,” says the sociologist who joined the University of Toronto’s Faculty of Arts &amp; Science this fall.</p>
<p>“Work is a huge part of our lives, of how we think about ourselves and compare ourselves to others. It’s also a driver of inequality because your job determines whether you can feed yourself and live a nice life.”</p>
<p>Alegria’s research delves primarily into racial and gender inequality. Her work attempts to evaluate how, why and in what form inequalities persist – and what the implications are for workers’ lives.</p>
<p>In particular, she’s taken an interest in the technology sector, studying the career trajectories of women in tech – a project detailed in <a href="https://journals.sagepub.com/doi/full/10.1177/0891243219835737" rel="noopener noreferrer" target="_blank">a paper published in the October issue of the journal <em>Gender and Society</em></a>.</p>
<p>Alegria’s research found that women with a technical background – say, in engineering or computer science – may start out in technical jobs, but eventually get promoted into managerial positions, often due to the stereotype that women have better interpersonal and communications skills.</p>
<p>While a promotion may seem like a good thing, Alegria says that there are still processes of exclusion at work.</p>
<p>“Having women in leadership positions is important,” says Alegria, who is an assistant professor in the department of sociology. “But they tend to get stuck in mid-level manager positions – never going high enough to make meaningful structural differences in the company culture.
Plus, they often accept these promotions to avoid dealing with technical co-workers doubting their competence in a hostile work environment.”</p>
<p>Moreover, the notion that technical jobs like engineering do not require interpersonal skills is itself a grave oversight, says Alegria.</p>
<p>“Engineers have to talk to people to figure out how the product being built will be used,” says Alegria. “That communicative aspect of engineering tends to get ignored or partitioned off into someone else’s job.”</p>
<p>Alegria’s interest in sociology was instant when she took an introductory course early in her undergraduate studies at Vassar College, a small liberal arts school in upstate New York.</p>
<p>“I realized sociology could give me a language to talk about inequality. I had never had that. It was what I wanted most in the world. And it’s still the thing I most want in the world.”</p>
<p>Her work on the tech sector is part of her larger interest in so-called “knowledge work,” the industries and labour processes where knowledge and information are produced and shared – all of which are inextricable from technology.</p>
<p>Researching a sector where there is rapid change is “exciting and terrifying,” says Alegria. “In addition to what technology does, I want to think about structural innovations in how we organize work. Sure, we can get cars to drive themselves, but what does that mean for workers?”</p>
<p>Alegria joins a growing consensus that the world of work is changing – and fast. Companies are outsourcing labour to contract workers, “and that means that there are fewer full-time positions with benefits and things like that,” she says.</p>
<p>While there are some advantages to tech companies creating more flexible, dynamic workplaces, “jobs are definitely more precarious,” adds Alegria.</p>
<p>Asked what advice she’d give students and young workers entering the job market today, Alegria says it’s “important to collect as many skills as you can. Learn to write well, learn to use data. Learn to give a good, thoughtful presentation. These things are important in any job. And organize for labour rights – that’s a big thing for sure.”</p>
<p>But most importantly, she says, “learn how to learn. The world changes so fast that you need to be able to retool and recalibrate your skills.”</p>
<p>In the meantime, Alegria is conveying her enthusiasm for her research inside the classroom.</p>
<p>On the first day of her third-year course on race, class and gender this semester, Alegria showed students <a href="https://commons.wikimedia.org/wiki/File:Classic_shot_of_the_ENIAC.jpg" rel="noopener noreferrer" target="_blank">a photo</a> of technicians, both male and female, working on the first electronic general-purpose computer, the ENIAC machine.</p>
<p>Originally published in <em>Popular Science</em> magazine, the photo was used by the U.S. Army as a recruitment tool in 1946, at which point the female workers were cropped out of the photo – perpetuating the stereotype that computers are “men’s work.”</p>
<p>“When I showed the students, there was an audible gasp,” says Alegria.</p>
<p>“We then had a great conversation about how women were literally deleted from the picture. And, of course, we still only see white workers on the machine.
So, I try to get students to consider how we think about who does what kind of work.”</p>
<p>Next up for Alegria: continuing to get to know 鶹Ƶ and the city she and her partner – also a sociologist – now call home.</p>
<p>“鶹Ƶ has one of the top sociology programs in the world, so it’s a very exciting place to be,” she says. “I have so many new colleagues whose work I’ve been reading for years and I’m so excited that I get to connect with them. It’s an incredibly vibrant intellectual space.”</p>