From Geek to Star #34 - The questions worth sitting with at the end of the year
Before planning 2026, a pause to reflect on what 2025 really changed in Tech & AI
“We shape our tools, and thereafter our tools shape us.”
If you missed the previous episodes, you can access them online here.
🗓️ This Week – Episode 34: The questions worth sitting with
This time of year is often the moment when we feel tempted to jump straight into planning for next year.
Before doing that, I believe there is value in something else: sitting with a few good questions before moving forward again.
2025 has been a dense year for Tech and AI. I will share what I think changed most in 2025, and then invite you to reflect on what this means for you as a tech leader, an engineer, or someone working closely with tech.
What 2025 changed in Tech & AI (a few signals I spotted)
AI moved from experiments to infrastructure
Early 2025 confirmed what many sensed: AI stopped being a side project and became embedded into everyday tools and workflows. Deloitte’s Tech Trends 2025 described AI as no longer an “add-on”, but something woven into the fabric of enterprise systems.
Talking with tech leaders across different types of companies, I do feel, however, that, as is often the case, the ones completely changing how they deliver software with AI are startups (especially those whose CEO has a strong engineering background) and scaleups. In more traditional companies, change is happening in pockets here and there, as non-tech C-level executives are still far from grasping the level of overhaul required.
Speed increased faster than judgement
As adoption accelerated, productivity gains have been real, but so, it seems, have the limits. Multiple analyses pointed out that organisations scaled AI faster than their ability to ensure context, understanding, and decision quality, not to mention the potential increase in technical debt and cybersecurity challenges.
Source (research perspective): https://arxiv.org/abs/2504.18081
AI became a leadership and governance topic
By mid-2025, AI discussions had clearly moved beyond engineering teams. Global summits and regulatory efforts brought ethics, risk, accountability, and explainability into the conversation. AI is becoming an essential element of geopolitics, whereas for many years those of us in tech considered technology to be rather neutral. For example, the AI Action Summit in Paris gathered governments, companies, and civil society around these questions. In Singapore, where I live, government officials regularly point out the work they are doing to strike a balance between giving innovation enough freedom and regulating so that things do not derail to the detriment of Singaporean society.
Autonomy raised new questions
Agentic and semi-autonomous AI workflows gained traction. The discussion shifted from “what can AI do?” to “how much autonomy are we ready to delegate and under what guardrails?”
This is pushing us, as tech leaders, closer to governance, trust, and responsibility topics.
Consequences became harder to ignore
Late 2025 saw increased attention on environmental, societal, and regulatory impacts of AI, including energy and water consumption, as well as new safety regulations.
The questions worth sitting with
Looking at these shifts, here are a few questions I think are worth sitting with for all of us working in Tech:
Where did technology genuinely enable better outcomes this year, in your company and around you… and where did it just become an end in itself, or even generate more negative results than positive ones?
When did speed help me, and when did it reduce my own understanding and mastery more than I would have liked?
How did AI change how I think, not just how fast I work?
Where am I still perceived as “just tech”? Do I feel that, with AI, my leadership and stakeholders see more value in me, or less? What did I do, or not do, to influence that perception?
When did I step into broader conversations about impact, risk, or responsibility and when did I stay in my comfort zone?
What decisions this year felt ethically complex rather than technically difficult?
If 2025 repeated itself, what would I intentionally do differently around my approach to Tech and AI?
🙏 I’d Love to Hear From You
If you have any thoughts on the questions above, do share them with me. Writing back is also a process that helps you reflect more deeply.
Reply to this email, I read every note.
Follow me on LinkedIn for more reflections and “behind-the-scenes” thinking between newsletters. Don’t hesitate to comment or reshare, it’s one of the best ways to grow your SHINE 🌟. If you want to know more about how I can support you 1-1 to thrive in your tech career, have a look at my offerings here.
P.S. Referral Pilot 🚀
Forward this email to one engineer or tech friend who might need a reminder: strong roots make lasting growth.
✨ May the SHINE be with you!
From Geek to Star by Khang | The Way Forward