Google and Marracash invite musical talents to take part in Nest Room Auditions
The 10 finalists will get a music mentorship session with Marracash, and the winner will also have the chance to record an unreleased track with producer Marz.
Entries are open from March 30 to April 24, 2022.
Details are on the website: the rapper invites hidden talents to take part in Nest Room Auditions. The 10 finalists will have the opportunity to meet Marracash for an exclusive music mentorship session, and the winner will record an unreleased track with the help of producer Marz (plus a few tips from the King of Rap!), to be distributed on the main streaming platforms in Italy.
How to enter Nest Room Auditions
To enter the contest you need to:
- Record a video of yourself performing one of your own unreleased tracks;
- Include a Google Nest Hub to enrich your performance. For example, Google Nest Hub can be used to play your backing track or to set up colored smart lights. Or, as in Marracash's video, create your own custom "Routine" that plays your backing track, sets the right mood with compatible smart lights, triggers stage effects with devices connected to smart plugs (lamps, neon signs, speakers, etc.), and does everything needed to turn your home into a stage;
- Upload the video to YouTube with visibility set to "Public";
- Fill out the entry form at g.co/NestRoomAuditions by April 24, 2022.
For tips and guidance on putting together your performance, see the FAQ. The finalists will be announced on May 10, 2022, and the winner will be unveiled to the public with their new track on June 7, 2022. For more information and to enter the contest, visit g.co/NestRoomAuditions.
1 Prize contest. Full rules at g.co/NestRoomAuditions.
“Lift as you lead”: Meet 2 women defining responsible AI
At Google, Marian Croak’s technical research team, The Center for Responsible AI and Human-Centered Technology, and Jen Gennai’s operations and governance team, Responsible Innovation, collaborate often on creating a fairer future for AI systems.
The teams complement each other to support computer scientists, UX researchers and designers, product managers and subject matter experts in the social sciences, human rights and civil rights. Collectively, their teams include more than 200 people around the globe focused on putting our AI Principles – Google’s ethical charter – into practice.
“The intersection of AI systems and society is a critical area of my team’s technical research,” Marian says. “Our approach includes working directly with people who use and are impacted by AI systems. Working together with Jen’s central operations team, the idea is to make AI more useful and reduce potential harm before products launch.”
For Women’s History Month, we wanted to talk to them both about this incredibly meaningful work and how they bring their lived experiences to it.
How do you define “responsible AI”?
Marian: It’s the technical realization of our AI Principles. We need to understand how AI systems are performing with respect to fairness, transparency, interpretability, robustness and privacy. When gaps occur, we fix them. We benchmark and evaluate how product teams are adopting what Jen and I call smart practices. These are trusted practices based on patterns we see across Google as we’re developing new AI applications, and the data-driven results of applying these practices over time.
Jen: There are enormous opportunities to use AI for positive impact — and the potential for harm, too. The key is ethical deployment. “Responsible AI” for me means taking deliberate steps to ensure technology works the way it’s intended to and doesn’t lead to malicious or unintended negative consequences. This involves applying the smart practices Marian mentioned through repeatable processes and a governance structure for accountability.
How do your teams work together?
Marian: They work hand in hand. My team conducts scientific research and creates open source tools like Fairness Indicators and Know Your Data. A large portion of our technical research and product work is centered in societal context and human and civil rights, so Jen’s team is integral to understanding the problems we seek to help solve.
Jen: The team I lead defines Google policies, handles day-to-day operations and central governance structure, and conducts ethical assessments. We’re made up of user researchers, social scientists, ethicists, human rights specialists, policy and privacy advisors and legal experts.
One team can’t work without the other! This complementary relationship allows many different perspectives and lived experiences to inform product design decisions. Here’s an example, which was led by women from a variety of global backgrounds: Marian’s team designed a streamlined, open source format for documenting technical details of datasets, called data cards. When researchers on the Translate team, led by product manager Romina Stella, recently developed a new dataset for studying and preventing gender bias in machine learning, members of my team, Anne P., N’Mah Y. and Reena Jana, reviewed the dataset for alignment with the AI Principles. They recommended that the Translate researchers publish a data card for details on how the dataset was created and tested. The Translate team then worked with UX designer Mahima Pushkarna on Marian’s team to create and launch the card alongside the dataset.
How did you end up working in this very new field?
Marian: I’ve always been drawn to hard problems. This is a very challenging area! It’s so multifaceted and constantly evolving. That excites me. It’s an honor to work with so many passionate people who care so deeply about our world and understanding how to use technology for social good.
I’ll always continue to seek out solutions to these problems because I understand the profound impact this work will have on our society and our world, especially communities underrepresented in the tech industry.
Jen: I spent many years leading User Research and User Advocacy on Google’s Trust and Safety team. An area I focused on was ML Fairness. I never thought I’d get to work on it full time. But in 2016 my leadership team wanted to have a company-wide group concentrating on worldwide positive social benefits of AI. In 2017, I joined the team that was writing and publishing the AI Principles. Today, I apply my operational knowledge to make sure that as a company, we meet the obligations we laid out in the Principles.
What advice do you have for girls and women interested in pursuing careers in responsible tech?
Marian: I’m inspired most when someone tells me I can’t do something. No matter what obstacles you face, believe you have the skills, the knowledge and the passion to make your dreams come true. Find motivation in the small moments, find motivation in those who doubt you, but most importantly, never forget to believe in the greatness of you.
Jen: Don’t limit yourself even if you don’t have a computer science degree. I don’t. I was convinced I’d work in sustainability and environmental non-profits, and now I lead a team working to make advanced technologies work better for everyone. This space requires so many different skills, whether in program management, policy, engineering, UX or business and strategy.
My mantra is “lift as you lead.” Don’t just build a network for yourself; build a supportive network to empower everyone who works with you — and those who come after you, especially those who are currently underrepresented in the tech sector. Your collective presence in this space makes a positive impact! And it’s even stronger when you build a better future together.
Go with the flow state: What music and AI have in common
Carrie Cai, Ben Zevenbergen and Johnny Soraker all work on developing artificial intelligence (AI) responsibly at Google, in the larger research community and across the technology industry. Carrie is a research scientist focusing on human-AI interaction, Ben is an ethicist and policy advisor and Johnny is an AI Principles ethicist. They all work within a global team of experts from a variety of fields, including the social sciences and humanities, focused on the ethical development of AI. They’re driven to make systems that are fair, inclusive and focused on people.
But they have more than their work in common: They’re all accomplished musicians who’ve studied music, composed and published pieces and even played at the professional level. We wanted to know more about their musical backgrounds, and how this creative interest informs their work building AI systems that take everyone into account.
What instrument — or instruments — do you play?
Ben: Guitar, bass and drums.
Johnny: Mainly drums these days, but I’ve also done ambient and electronica.
Carrie: I play piano and I also compose music.
Where did your interest in playing music come from?
Ben: I grew up in a musical family where instruments were always lying around. My parents’ friends would bring their instruments when they came to visit and our house would turn into a music venue. I enrolled in a music degree in my late teens to become a professional drummer. Then, a year later, I serendipitously became a bassist: I went to law school in the Netherlands, and the university band already had someone who was a better drummer than I was — but they needed a bassist, so I grabbed the opportunity.
Carrie: I started out in the Yamaha music program when I was six, where rather than learning technical piano playing skills you focus on ear training, hearing the music and how to play as an ensemble. I think that foundation led me to be a lot more creative with my music than I would have been otherwise. I spent part of my childhood years composing music, too — here are some of my early compositions from my high school days!
Johnny: I’ve played lots of instruments since I was a child, but never had the tenacity to get very good at any of them. Perhaps as a result of this, I got involved with a highly experimental ambient scene in the early 2000s and started the one-man project Metus Mortuus, using samples and DIY equipment to create often disturbing soundscapes. It was really only when I got hooked on the video game “Rock Band,” where you play “fake” instruments along with the notation on screen, that I put in the hours needed to get some basic limb independence and with that a platform for learning real drums.
Did you gravitate toward the drums in the game?
Johnny: No, I hardly ever touched them — I simply couldn’t make my left arm do something my right arm wasn’t doing, but one day I decided to try an experiment: Can I make these stale neural pathways of mine actually learn something new well into adulthood? I started practicing on these toy drums every day, which was painful and frustrating, but occasional breakthroughs kept me going. Eventually I achieved a level of limb independence I hadn’t thought I was capable of. I invested in proper e-drums and I’ve played almost every day since.
What’s your favorite thing about playing?
Johnny: It’s really the ultimate flow experience, where you’re fully immersed in an activity to the extent you lose track of time and only focus on the present moment. There’s lots of empirical research in the field of positive psychology suggesting that regular flow experiences promote better well-being.
Ben: I love playing the bass with a band because it’s the glue between the rhythm section and the melody sections. It’s fun when you purposefully come in a beat late: you can see people unsure whether to dance or not. When you start playing, suddenly the whole audience understands what’s going on. And then they have the audacity to say they never hear the bass!
How has music made its way into your work, if at all?
Carrie: It’s certainly affected how I think about my work today, particularly around how to make AI more controllable for everyday people. For example, we’re now seeing new, highly capable generative AI models that can compose music indistinguishable from something written by Bach. But we discovered that just because an AI model can make beautiful music doesn’t mean humans can always make beautiful music using AI.
When I create music, I’m thinking, “I want the beginning of the song to sound cheerful, then I want it to build tension, before ending in a somber way.” When I’m creating with AI, it can be difficult to express that — I can’t easily say, “Hey AI, make the first part happy and then build tension here.” This can make it difficult for people to feel a sense of artistic agency and authorship when they’re creating any kind of content with AI.
Recently, I collaborated with other human-computer interaction (HCI) and machine learning (ML) researchers at Google to create new tools enabling people to steer and control AI as they compose music with it. We found that these “steering tools” significantly boost users’ sense of creative ownership and trust as they compose with AI.
Do you think there’s anything about the sort of work you do that exercises the same sort of “mental muscles” as music does?
Johnny: Yes, I think the key to ethics — and especially ethics of AI where there often is no precedent — is to be able to approach a problem from different angles and draw connections between the case at hand and relevant, similar cases from the past. This requires you to think creatively. And I feel that the way in which drumming almost literally rewired my brain has made me much better at doing that.
Ben: When you learn to play the drums, one of the hardest things is learning to separate the movements of your limbs in your mind. It’s pretty difficult to process — which makes it a very nice experience once your mind can asynchronously control parts of your thinking to create interesting rhythms that are still in time. For my work on the ethics of technical design, I frequently have to understand many interacting but very different disciplines. I’m not sure if it has anything to do with drumming, but I find that I can think about these things in tandem, even though they are completely different.
Carrie: I remember once when I was little, I woke up and without even changing out of my pajamas, spent the entire day composing a piece of music. I realize now that that was a flow state — I was working on something that was challenging yet doable. I think that’s a key property of creativity and it’s affected how I work in general. It’s easiest for me to be productive when I’m in that state — working on something that’s challenging, but not so difficult that I won’t want to start it or keep going. That’s helpful in research because there’s so much uncertainty — you never know if your experiments are going to work! But I can take a lesson from how I got into that flow state with music and apply it to research: How can I as a research scientist enter a flow state?