🗞️Diversity and inclusion news🗞️
📉 UK Tech’s Diversity Report: Less Pipeline, More Plumbing Problems
The UK government just dropped its most comprehensive diversity-in-tech review in years—and, shockingly, it doesn’t look good.
As the global right attacks DEI (👋 Trump 2.0), UK firms are quietly binning Pride logos and “pausing” inclusion budgets. This report lands like a quiet counterpunch: diversity isn’t just morally right, it’s economically critical. But whether it drives action, headlines or neither? That’s still unclear.
📊 Key Findings: What the Report Actually Says
- Women comprise only 21% of tech teams. One in three women plans to leave their role due to lack of career progression, poor work-life balance, and unsupportive culture.
- 21% of IT specialists identify as Black, Asian, or an ethnic minority, but there's an 11% drop in representation at senior levels.
- Only 9% of tech employees come from lower socio-economic backgrounds, and they earn less than their more affluent peers.
- 14% of the workforce identifies as disabled, but disclosure remains a challenge.
- Older professionals (50+) are underrepresented, making up only a fifth of the tech workforce despite being 31% of the working-age population.
- A "leaky pipeline" persists: recruitment of diverse entry-level candidates has slowly improved, but this momentum diminishes at mid-career and leadership levels.
- Underrepresented minorities are twice as likely to leave their jobs due to unfair treatment (e.g., unjust management, stereotyping, harassment) rather than for a better job.
- 20% of men in tech believe women are inherently less suited for these roles.
- 21% of data centres have no women at all in their technical function.
- Economic downturns have led to budget cuts, diluting or stopping DEI efforts.
Entrepreneurship:
- The UK startup ecosystem is valued at over $1.1 trillion.
- 92% of angel investments in 2022 were allocated to all-white teams.
- Women-led startups secured just 2% of annual venture capital (VC) funding, a figure that shows no improvement in the first half of 2024.
- 71% of partners in VC funds are privately educated, highlighting systemic barriers for underrepresented groups.
- While ethnic representation among UK VCs has seen 100% growth since 2019, significant gaps persist due to low starting points.
- The Invest in Women Taskforce recently raised over £250 million to support female entrepreneurs.
Skills:
- 93% of businesses report a significant gap in IT skills.
- The digital skills deficit costs the UK economy an estimated £63 billion annually.
- 42% of businesses attribute the skills gap to rapid technological advancements like AI and data analytics.
- AI-enabled technologies are more likely to impact roles held by women, stressing the need for regular skills development.
- Bootcamps attract 44% women, but are still viewed as second-tier by many employers.
Solutions and Recommendations:
The report emphasizes the need for a coordinated approach between government and industry to address these challenges. Key solutions include:
- Creating inclusive workplace cultures: Promoting psychological safety, addressing unconscious biases, improving data collection, and fostering authentic role models.
- Enhancing flexibility: Offering diverse work options.
- Investing in skills development: Providing training, upskilling initiatives, and transparent promotion practices.
- Implementing accountability: Tying DEI metrics to leadership performance and enforcing clear reporting processes for discrimination.
- Targeted interventions for entrepreneurs: Increasing funding and tailored business support for underrepresented founders, and requiring diversity reporting in the VC industry.
- Improving skills pathways: Developing lifelong learning networks, standardising competency frameworks (like SFIA or DDaT), and addressing data gaps for social mobility, LGBTQI+, and neurodivergent individuals.
The report concludes that sustained commitment and structural changes are vital to foster an inclusive UK tech sector that drives innovation and economic growth, ensuring opportunities are available to all.
It’s time for joined-up policy, properly funded programmes, and industry-wide accountability. Because without it? The UK won’t just lose talent—it’ll lose its edge.
📥 Read the full DSIT report here
📉👀 Workplace Reporting Bias: Women’s Complaints Still Get Ignored📉👀
New research from UNC, Maryland and Wharton has some uncomfortable receipts for every HR department that likes to brag about its speak-up culture: if you’re a woman, your report of workplace abuse is less likely to be taken seriously than an identical report made by a man.
Researchers analysed thousands of real reports from US federal employees and ran five experiments. The bias was clear:
- 📉 Reports by women were less likely to result in action — especially when they lacked clear supporting evidence (which is common for bullying or harassment).
- ⚖️ When the same report came with corroborating proof (like chat logs), the gender gap disappeared — suggesting credibility bias is the problem, not the content.
- 🧠 It’s not just a people issue, it’s structural: managers have too much discretion and gut instinct fills the evidence gaps with stereotypes about “credibility.”
Why it matters: This shows how bias is baked into the systems that are supposed to protect underrepresented groups. If women’s complaints get discounted, the abuse and bad culture stick around.
What works better? The researchers suggest five fixes:
✅ Separate intake from investigation: Managers shouldn’t decide what goes up the chain — trained teams should.
✅ Standardise how reports are evaluated: Redact details like gender.
✅ Commit to follow up: Every report, big or small, must get a documented outcome.
✅ Multiple safe reporting channels: Don’t force people to rely on their line manager.
✅ Support for reporters: Let people add evidence over time, and make it easy for others to corroborate.
🔍 Big picture takeaway: Good whistleblowing policies are worthless without a fair, bias-proof process. This is more than an HR compliance box — it’s about not gaslighting people who speak up.
You can read more here
🧠Things that make you go hmmm🧠
📚AI starts to read📚
So… AI gets to read your book, but not pirate it?
In a major U.S. ruling this week, a federal judge has declared that AI companies can legally train their models on copyrighted books without the authors’ permission — under the legal doctrine of fair use😬
Anthropic, the company behind Claude (an OpenAI rival), is the first to receive such a ruling in its favour — and it’s a potential game-changer for dozens of similar lawsuits across the U.S.
But before Silicon Valley throws a champagne party in a pirated PDF file — hold the fireworks. While Judge William Alsup backed Anthropic’s training practices as “transformative,” he also ruled that the company’s decision to download and hoard more than 7 million pirated books in a so-called “central library” did breach copyright. That part is heading to trial in December. 🧑🏽⚖️
Why this matters
- A win for Big Tech: This gives firms like Meta, Google, and OpenAI a legal boost in their fair use defence — claiming their models don’t copy, but learn.
- A blow for creatives: Authors, artists, and publishers fighting for compensation now face an uphill battle.
- UK? Don’t get excited yet: The ruling only applies to the U.S. The UK’s copyright rules are much stricter — and don’t recognise “fair use” in the same way. In fact, the UK creative industry has pushed back hard against similar government proposals.
Meanwhile, in a twist of poetic irony, our editor fed this very ruling into ChatGPT to summarise it. 🪤
📖 Link to TechCrunch coverage: https://techcrunch.com/2025/06/24/a-federal-judge-sides-with-anthropic-in-lawsuit-over-training-ai-on-books-without-authors-permission/
🗞 Guardian summary: https://www.theguardian.com/technology/2025/jun/25/anthropic-did-not-breach-copyright-when-training-ai-on-books-without-permission-court-rules
🔍 So What?
A patchwork of rulings and appeals is likely before we get any Supreme Court clarity in the U.S. Until then, the precedent is set — and if you're a writer, your book might be helping train AI whether you like it or not.
🔞Age verification🔞
The US Supreme Court just upheld a Texas law forcing people to show a government ID or do a face scan to access online porn. It’s a landmark 6–3 ruling that hands states a green light to push strict age gates — even if that means huge privacy trade-offs.
📜 The Backstory:
The Texas law, passed in 2023, says adult sites must verify users are over 18 to protect minors. Pornhub and other major sites sued, arguing this breaks the First Amendment by making it harder for consenting adults to access legal content — and risks leaks of deeply sensitive personal data.
👨⚖️ The conservative-majority court sided with Texas: “The power to require age verification is within a State’s authority to prevent children from accessing sexually explicit content,” wrote Justice Clarence Thomas.
🌍 Why This Matters:
- More than a dozen US states are copying Texas’ model. Expect foreign governments to look at this for their own ends too.
- Sites like Pornhub have already pulled out of states with similar laws — in Texas, they’ve gone dark since March. But what do they do when every state potentially requires it?
- There’s growing evidence that age gates don’t work: people switch to offshore sites or VPNs instead.
🕵️♀️ The Privacy Dilemma:
The adult industry calls this ruling a canary in the coal mine for free speech and digital privacy. Critics say handing over biometric scans or ID details to access sensitive content invites hacks, leaks, or state overreach — with chilling effects on other kinds of adult content too (think LGBTQ+ or sex education resources).
⚖️ What Next?
This sets a big precedent: US states now have a clear Supreme Court signal that they can demand personal data to access “harmful” online content. Expect more global debates on where to draw the line — from porn to violent videos to AI-generated deepfakes.
Oh, and why doesn’t Twitter/X come under this law?
Well, under the Texas law (and similar US state “age verification” laws), sites are covered if they “display more than one-third pornographic content” — meaning explicit sexual material that’s deemed “harmful to minors.”
👉 Why Twitter/X doesn’t get hit (for now):
- Mixed content: Platforms like Twitter/X have explicit content, but it’s not the main purpose of the site. So they typically don’t cross the “one-third porn” threshold in the way that Pornhub does.
- User-generated loophole: Social platforms can argue they’re not “publishers” of adult content — they host it, but it’s uploaded by users, and moderation is their responsibility.
- Enforcement mess: Requiring age checks on all user-generated platforms with some adult material (think Reddit, Twitter/X, or Discord) is way more complicated. It raises questions about how to do selective gating — and how you verify age without tracking everyone’s entire browsing history.
Read more here or here
Oh, and Tinder is at it in California, and that’s not even for explicit stuff.
📉 So what?
Where does platform responsibility end and personal privacy begin? And who decides what’s “harmful enough” to justify surveillance-level checks? We can imagine the companies won’t want to ask for much more than they need, but maybe there will be an appeal if they can use the data for more than just age verification.
📈 The tools behind the tech📉
📦Product📦
📏Design📏
👩🏿💻Code👩🏿💻
🏢The business behind the tech🏢
🌐Partner Events & Opportunities 🌐
DOJO!
Tap into tech with our In-Person Recruitment Event in London with Dojo
On Jul 16th, Dojo is collaborating with Colorintech to host an exciting in-person recruitment event, Tap into Tech: Discover your next Technical Role with Dojo.
They’re currently focused on hiring for a range of mid-level and above technical roles (3+ years’ experience). This includes:
Software Engineers
Product Managers
Data Engineers
This is your opportunity to meet and network with the wonderful team at Dojo, including the recruitment team, who are happy to answer your questions, whilst enjoying some tasty food and drink.
Check out the key details below:
Date: Wednesday 16th July
Time: 18:30 - 21:30
Where: London
As this event has limited spaces, we'll need you to register your interest to attend using our application form below:
https://form.typeform.com/to/nj2FU475
After applying, I will reach out to you if Dojo has chosen you to attend the event and will provide you with the next steps! Please note, filling out the form does not guarantee you a space at the event.
🙌Summer time!🙌
Interested in a night of career insights, rooftop networking, and games with the Colorintech Community?
On Thursday 24th July, Colorintech is teaming up with Samsara for a night of insightful conversations on careers and progression at companies of all sizes, live music and even a ping pong tournament!
In our last two community surveys, we've seen an increasing desire for talks about what it’s like to work at startups and scaleups as well as corporates…
Now we’re making it happen.
Our beachside chat aims to tackle this topic by asking:
What's it like to adapt and thrive in different company phases?
What are the benefits and challenges of a career focused on any of these stages?
How do you know which option may be a good choice for you?
To explore these questions, we have an inspiring set of panellists, including:
Adnan Omar: Chief Content Officer at PRYNTD
Munaaf Ghumran: Enterprise Sales Engineering EMEA @ Samsara
Nyasha Duri: AI Security Researcher at Apart Research
Pearlé Nwaezeigwe: Strategic Marketing Consultant @ Wae Collective
As it’s the start of summer, we plan to bring the good summer vibes on Samsara's wonderful rooftop - let's hope the weather is warm and bright! So we’ve planned:
Live Music,
Activities like a Ping Pong tournament,
Rooftop drinks!
Whether you’re a seasoned professional, early in your career, or a founder thinking about hiring, this is a fantastic opportunity to connect with our wonderful Colorintech community, including our friends at Samsara!
Check out the key details below:
Date: 24th July 2025
Time: 18:00 - 21:30 BST
Where: London, E1
As this event has limited spaces, we'll need you to sign up using our Luma Link below:
https://lu.ma/Ascent-of-Work
A case study
In April 2024, Colorintech and Captify joined forces on a fantastic event where we gathered a group of Colorintech community members to network and share insights on careers in sales, engineering, data and product! It was truly a great evening!
Now, we're pleased to celebrate that one of our wonderful community members, Robert Onuma, has joined the Captify Insights team as a Junior Insights Analyst and has successfully passed probation. This just goes to show the power of networking and connecting at events! Shout out to the awesome Captify team involved in making the event happen (Sabrina, Abbie, Baltej and Roksolana), and to our Community and Events Lead, Catherine, for supporting them along the way!
🙌🏾The latest from the Colorintech team🙌🏾