🗞️Diversity and inclusion news🗞️
|
|
|
♿ DEI v the law😮
The US Equal Employment Opportunity Commission (EEOC) has filed its first major lawsuit against a workplace diversity initiative under the current Trump administration — and, somewhat incredibly, the target isn’t hiring quotas, promotions, or pay equity.
It’s a networking event.
A Coca-Cola bottler is being sued after hosting a two-day leadership retreat exclusively for female employees, complete with speakers, team-building, and hotel stays covered by the company. Men weren’t invited. The EEOC argues that alone may constitute unlawful sex discrimination.
Yes — we have officially reached the phase of history where networking events have entered the culture wars.
On paper, this looks like a niche legal dispute. In reality, it’s a signal that DEI is entering its next evolution — one shaped less by corporate pledges and more by legal precision.
And oddly enough, that may not be entirely bad news.
For the past decade, many diversity programmes operated under a shared assumption: if an initiative aimed to support historically underrepresented groups, its intent largely justified its structure.
That assumption is now being stress-tested.
The EEOC’s argument is deceptively simple:
If participation in an employer-sponsored opportunity carries professional benefit, excluding a protected class — including men — may violate discrimination law.
In other words: good intentions are no longer enough; programme design matters just as much.
Cue collective HR anxiety.
But here’s the more optimistic read: this moment is less about dismantling inclusion and more about forcing organisations to mature beyond performative DEI into durable, well-engineered inclusion.
Because let’s be honest — some workplace DEI initiatives were built quickly in response to social pressure, not always with long-term legal or structural thinking in mind. The era of “launch first, legal review later” was probably never going to last forever.
This case reframes DEI from a purely cultural or values conversation into something organisations must design as carefully as compensation, governance, or risk management.
And companies are paying attention — not because inclusion is ending, but because it’s becoming infrastructure.
🧠Things that make you go hmmm🧠
📱 Meta’s Research Says Parental Controls Don’t Really Work — Awkward.
In testimony from a major US social media addiction trial, internal Meta research revealed something quietly explosive: parental supervision and screen-time controls appear to have little impact on teens’ compulsive social media use. Yes — the very tools parents are told will fix the problem may not actually fix the problem. 😬
The study, known internally as Project MYST, found no meaningful link between parental monitoring and whether teenagers could regulate their own social media use. In other words, limiting screen time or setting app rules didn’t significantly change behaviour — a finding that complicates years of “it’s the parents’ responsibility” narratives. 📊
Naturally, this surfaced during a lawsuit accusing Meta and other platforms of designing addictive products that contributed to anxiety, depression, and self-harm among young users. And suddenly, what looked like a parenting debate starts to look a lot more like a product design debate. ⚖️
🧠 The Algorithm vs The Parent
The uncomfortable implication of the research is simple: you can’t out-parent an algorithm optimised for engagement. 🤖
The lawsuit argues platforms rely on dopamine-driven design — endless feeds, variable rewards, and notifications engineered to keep users scrolling. Meta, meanwhile, prefers the term “problematic use,” which is corporate shorthand for “people spending more time here than they’d like, but not technically addicted.” 🌀
Even Instagram head Adam Mosseri appeared unfamiliar with details of the study during testimony, despite documents suggesting leadership approval — proving once again that internal research becomes much harder to remember once lawyers enter the room. 🧾
So what? This isn’t just about teens and Instagram — it’s a preview of the AI era. As feeds become more personalised and agentic AI systems learn emotional patterns in real time, vulnerability-aware design will become a governance issue, not a UX choice. Today’s lawsuits about scrolling behaviour could become tomorrow’s rules about emotionally adaptive AI systems.
🔞 The UK Just Fined a Porn Company £1.35m — And This Is Really About the Future of the Internet
Ofcom has issued its largest fine yet under the Online Safety Act, hitting a porn company with a £1.35m penalty for failing to properly verify users’ ages. Translation: UK internet regulation has officially moved from policy papers to penalties — and enforcement has entered the chat. 💸
The company, 8579 LLC, failed to introduce “highly effective” age checks after new rules came into force in July 2025, allowing children to access adult content. It was also fined £50,000 for ignoring Ofcom’s information requests — never a strong opening move when regulators are looking for examples. 📩
This isn’t really a story about porn. It’s about regulators proving that compliance is no longer optional — and that the Online Safety Act is real, operational, and expensive to ignore. 🌐
For years, online safety lived mostly in consultations and conference panels. Now the invoices are arriving. 🧾
The law requires platforms to implement genuinely robust age assurance — not “Are you over 18? Yes/No” popups. Ofcom acted within days of the rules taking effect, signalling a clear shift: deadlines matter, enforcement is early, and “wait and see” is no longer a viable strategy. ⏱️
The message to platforms is simple: safety obligations are now infrastructure, not PR. ⚖️
🍿 Remember the Pornhub Story?
If this sounds familiar, it should. Earlier this month, Pornhub restricted UK access, arguing the law could push users toward darker parts of the web — a tension we covered recently in the newsletter. 🔞
Together, these moments show regulation’s new reality: platforms warn about unintended consequences, regulators prioritise child safety, and the internet is forced to adapt somewhere in the middle. 🪙
The key shift? Online safety is moving from voluntary moderation to enforceable systems. 🏗️
🇬🇧 Why This Matters Beyond Adult Sites
Adult platforms are just the first test case. The same Online Safety Act framework applies to harmful content involving self-harm, eating disorders, recommendation algorithms, and increasingly AI-generated material. 🤖
In practice, this is a blueprint for how governments plan to regulate digital risk more broadly — and why safeguarding, wellbeing, and inclusion conversations are starting to overlap in ways tech companies can’t ignore. 🧠
So what? The internet is entering its compliance era. After years of platforms moving fast and breaking things, regulators have decided to slow down and invoice things instead. The result isn’t the death of the internet — it’s the arrival of adult supervision. Slightly less chaotic, marginally more paperwork, and significantly fewer “just trust us” product launches. 🧾
📈 The tools behind the tech 📉
📦Product📦
📏Design📏
👩🏿💻Code👩🏿💻
🏢The business behind the tech🏢
🛍️Tech deal of the week🛍️
All image credits to Amazon.
Thinking about jetting off to get some winter sun? You may need one of these for under 20 quid.
Link here and check out our other deals too
And view our shop with our whole collection here
😅Meme/AI video of the week 😅 (the internet can be savage lol)
🌐Partner Events & Opportunities 🌐
Below are the top opportunities we want to highlight to you this week! If you want to see more, then check out our new website where we have a whole page dedicated to events and opportunities from us and our partners:
https://www.colorintech.org/events
🙌Discover Arm🙌
Join us on Tuesday, 4th March, at 6:30pm GMT for Discover Arm, a virtual event tailored to experienced engineers who are passionate about innovation and cutting-edge technology.
Hosted in collaboration with Colorintech and Black@Arm, this one-hour session will provide you with:
- An exclusive look into the groundbreaking projects Arm’s engineers are leading.
- Insights into the thriving Black@Arm community.
- An interactive panel discussion featuring distinguished engineers from Arm.
Register here
📊 Master the Art of Data: Fully Funded PhD Opportunity
The world is drowning in data, but very few people know how to tell its story. "Big Data Specialist" is currently one of the fastest-growing job roles globally — and Colorintech has teamed up with Diverse CDT to make sure you’re at the forefront of it.
What’s the deal? Diverse CDT (a joint venture between City St George’s and Warwick University) is looking for the next generation of data pioneers. They are offering fully funded PhDs in Data Visualization with a massive focus on—you guessed it—diversity and inclusion.
Why should you care?
- Zero Tuition, Full Funding: Get your PhD without the debt.
- High Demand: Master the top skills employers want: analytical and creative thinking.
- Real Impact: Learn to turn complex data into visual stories that solve global challenges.
Join the Exclusive Interactive Event: See data in action, meet current students, and find out if a PhD is your next power move.
- When: Thursday, 5th March | 15:00 – 17:00 GMT
- Where: City St George’s, Clerkenwell, London
Register here
🙌🏾The latest from the Colorintech team🙌🏾