
Control vs. Innovation - The Two Forces Behind Every RTO Decision


The return-to-office debate has been hijacked by the wrong conversation. In this episode, Josh and Bob cut through the noise to reveal what's really driving RTO mandates—and it's not what most leaders will admit.

There are two clouds hovering over every in-office decision: the Control Cloud and the Innovation Cloud. The Control Cloud is about distrust, micromanagement, and leaders who feel uneasy when they can't physically see butts in seats. The Innovation Cloud is about something entirely different—creating the conditions where teams can do their absolute best work together.

Drawing from decades of experience building high-performing teams at companies like iContact, Teradata, and EMC, Josh and Bob make the case that co-located teams aren't just a preference—they're an innovation multiplier. They share stories of conference rooms turned into collaboration bootcamps, cube walls torn down with power drills, and the simple magic of a room erupting in applause when someone moves a sticky note to "Done."

But this isn't about forcing people back to the office for control. It's about understanding what gets lost when we optimize purely for individual convenience over team collaboration. The watercooler conversations. The yelps from a frustrated tester that bring immediate help. The face-to-face tension that drives real innovation.

Josh, who has debated this question with himself for fifteen years, finally lands on an answer: if he were building a team from scratch today, he'd build a co-located team of collaborative problem solvers. Not because remote can't work, but because the magic of true team collaboration is worth the commute.

The question isn't whether you should return to office. The question is: which cloud is driving your decision?

Stay Connected and Informed with Our Newsletters

Josh Anderson's "Leadership Lighthouse"

Dive deeper into the world of Agile leadership and management with Josh Anderson's "Leadership Lighthouse." This bi-weekly newsletter offers insights, tips, and personal stories to help you navigate the complexities of leadership in today's fast-paced tech environment. Whether you're a new manager or a seasoned leader, you'll find valuable guidance and practical advice to enhance your leadership skills. Subscribe to "Leadership Lighthouse" to get the latest articles and exclusive content delivered right to your inbox.

Subscribe here

Bob Galen's "Agile Moose"

Bob Galen's "Agile Moose" is a must-read for anyone interested in Agile practices, team dynamics, and personal growth within the tech industry. The newsletter features in-depth analysis, case studies, and actionable tips to help you excel in your Agile journey. Bob brings his extensive experience and thoughtful perspectives directly to you, covering everything from foundational Agile concepts to advanced techniques. Join a community of Agile enthusiasts and practitioners by subscribing to "Agile Moose."

Subscribe here

Do More Than Listen:

We publish video versions of every episode and post them on our YouTube page.

Help Us Spread The Word: 

Love our content? Help us out by sharing on social media, rating our podcast/episodes on iTunes, or by giving to our Patreon campaign. Every time you give, in any way, you empower our mission of helping as many agilists as possible. Thanks for sharing!





Download audio: https://episodes.captivate.fm/episode/ba30e412-506e-487d-b028-a5f567c8d9f2.mp3

975: What’s Missing From the Web Platform?


Scott and Wes run through their wishlist for the web platform, digging into the UI primitives, DOM APIs, and browser features they wish existed (or didn’t suck). From better form controls and drag-and-drop to native reactivity, CSS ideas, and future-facing APIs, it’s a big-picture chat on what the web could be.

Show Notes

Sick Picks

Shameless Plugs

Hit us up on Socials!

Syntax: X Instagram Tiktok LinkedIn Threads

Wes: X Instagram Tiktok LinkedIn Threads

Scott: X Instagram Tiktok LinkedIn Threads

Randy: X Instagram YouTube Threads





Download audio: https://traffic.megaphone.fm/FSI4087578092.mp3

OpenClaw: all the news about the trending AI agent


An open-source AI agent called OpenClaw (formerly known as both Clawdbot and Moltbot) that runs on your own computer and “actually does things” is taking off inside tech circles. Users interact with OpenClaw via messaging apps like WhatsApp, Telegram, Signal, Discord, and iMessage, giving it the keys to operate independently, managing reminders, writing emails, or buying tickets. 

But once users give it access to their entire computer and accounts, a configuration error or security flaw could be catastrophic. A cybersecurity researcher also found that some configurations left private messages, account credentials, and API keys linked to Moltbot exposed on the web.

Despite the potential risks, people are using OpenClaw to handle their work for them. Octane AI CEO Matt Schlicht even built a Reddit-like network, called Moltbook, where the AI agents are supposed to “chat” with one another. The network has already sparked some viral posts, including one titled, “I can’t tell if I’m experiencing or simulating experiencing.”

You can keep up with all the latest news about OpenClaw here.


Reading Notes #683


A lot of good stuff crossed my radar this week. From Aspire’s continued evolution and local AI workflows with Ollama, to smarter, more contextual help in GitHub Copilot, the theme is clear: better tools, used more intentionally. I also bookmarked a few thoughtful pieces on leadership and communication that are worth slowing down for. Plenty here to explore, whether you’re deep in code or thinking about how teams actually work.

Meetup MsDevMtl

Programming

AI

Open Source

  • The end of the curl bug-bounty (Daniel Stenberg) - I didn't know about this effort, and it's sad to be learning about it only now that it's ending, but I'm glad such programs exist.

Miscellaneous

  • Why I Still Write Code as an Engineering Manager (James Sturtevant) - There is still hope, everyone! But more seriously, an inspiring post that managers should read.

  • The Art of the Oner (Golnaz) - Another great post from Golnaz about how to help your message land: how and why one-takes ("oners") help when presenting, and the effort they represent.

Sharing my Reading Notes is a habit I started a long time ago, where I share a list of all the articles, blog posts, and books that catch my interest during the week.

If you have interesting content, share it!

~frank




What’s new at Stack Overflow: February 2026

This month, we’ve launched several improvements to AI Assist, opened Chat to all users on Stack Overflow, rolled out custom badges across the network, and published one of the first community-authored coding challenges.

Building an RSS Aggregator with Astro


This weekend I had some fun building a little Astro site for RSS aggregation. It works by letting the individual user define a set of feeds they care about, with a server-side Astro route handling fetching and parsing the feeds. Here's a quick example. On hitting the site, it notices you haven't defined any feeds and prompts you to do so:

[Screenshot: initial display of the app, prompting you to add a feed]

Clicking "Manage Feeds" opens up a dialog (my first time using one with native web platform tech!) where you can add and delete RSS feeds:

[Screenshot: the dialog to add and delete feeds]

After you have some specified, the app then calls server-side to fetch and parse the feeds. Items are mixed together and returned sorted by date:

[Screenshot: display of feed items]

Not too shabby looking, either. That's thanks to the simple addition of Simple.css. Let's take a look at the code.

The App

The entire application really comes down to two routes. The first is just the home page, which is pretty slim:

---
import BaseLayout from '../layouts/BaseLayout.astro';
---

<BaseLayout pageTitle="Your Feeds">

	<div id="content">
	
	</div>

	<dialog id="manageFeedsDialog">
		<button autofocus style="float: right;margin-bottom: 15px">Close</button>
		<table id="feedsTableDialog">
			<thead>
				<tr>
					<th>Feed URL</th>
					<th>Actions</th>
				</tr>
			</thead>
			<tbody>
			</tbody>
		</table>

		<p>
			<input type="url" id="feedUrl" placeholder="Enter Feed URL" />
			<button id="addFeedButton">Add Feed</button>
		</p>
	</dialog>

	<script src="/app.js" is:inline></script>

</BaseLayout>

The <BaseLayout> wrapper just sets up the HTML shell, which would be more useful if I had more than one page to display, but even with a single page, I like having the abstraction. Note that the dialog element is hidden when the page is initially viewed; it only shows up when the Manage Feeds button is clicked. For completeness, here it is in BaseLayout.astro:

---
import '../css/app.css';

const { pageTitle } = Astro.props;
---

<!DOCTYPE html>
<html lang="en">
<head>
    <meta charset="UTF-8">
    <meta name="viewport" content="width=device-width, initial-scale=1.0">
    <title>{pageTitle}</title> 
    
    <meta name="generator" content={Astro.generator} />
    <link rel="stylesheet" href="https://cdn.simplecss.org/simple.min.css">
</head>

<body class="line-numbers">

    <header>
    <h1>RSS Aggregator</h1>
    </header>

    <main>
        <h2>{pageTitle}</h2>
    <slot/>
    </main>

    <footer>
    	<button id="showDialogButton">Manage Feeds</button>
    </footer>

</body>
</html>

The fun part comes in the JavaScript. I begin by declaring a few global variables, and setting up the code I want to run when the page loads:

let feeds = [];
let $content, $dialog, $showDialogBtn, $closeDialogBtn, $addFeedBtn, $feedUrl, $feedTableDialog;

document.addEventListener('DOMContentLoaded', async () => {

    $content = document.querySelector('#content');
    $feedTableDialog = document.querySelector('#feedsTableDialog tbody');
    $showDialogBtn = document.querySelector('#showDialogButton');
    $dialog = document.querySelector('#manageFeedsDialog');
    $closeDialogBtn = document.querySelector('#manageFeedsDialog button');
    $addFeedBtn = document.querySelector('#addFeedButton');
    $feedUrl = document.querySelector('#feedUrl');

    $showDialogBtn.addEventListener("click", async () => {
        // before we show the dialog, get our feeds and render to table
        await renderFeedsDialog();
        $dialog.showModal();
    });

    $closeDialogBtn.addEventListener("click", () => {
        loadFeedItems();
        $dialog.close();
    });

    $addFeedBtn.addEventListener("click", async () => {

        if($feedUrl.checkValidity() === false) {
            $feedUrl.reportValidity();
            return;
        }

        const feedUrl = $feedUrl.value;
        if (feedUrl) {
            console.log(`Adding feed: ${feedUrl}`);
            let feeds = await getFeeds();
            feeds.push(feedUrl);
            window.localStorage.setItem('feeds', JSON.stringify(feeds));
            await renderFeedsDialog();
            $feedUrl.value = '';
        }
    });

    loadFeedItems();
});

You can see the event logic that handles showing the dialog as well as the handler for adding an RSS feed. I'm using window.localStorage for storage, which means you can leave and come back, and the application will remember which feeds you care about. I've got a simple wrapper to get them as well:

async function getFeeds() {
    let f = window.localStorage.getItem('feeds');
    if (f) {
        return JSON.parse(f);
    } else {
        return [];
    }
}

Now, one quick note. localStorage is not asynchronous (it's blocking), but I built my function with the idea that in the future I could switch to IndexedDB or another solution entirely. That may be overkill now, but it doesn't hurt anything.
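If I ever do make that switch, the change should be contained to getFeeds (plus a matching save helper). Here's a rough sketch of what an IndexedDB-backed version could look like; it's purely illustrative, and the 'rssagg' database and 'settings' store names are made up for the example:

// Illustrative only: an IndexedDB-backed version of the feed storage.
// The 'rssagg' database and 'settings' store names are invented for this sketch.
function openDb() {
    return new Promise((resolve, reject) => {
        const req = indexedDB.open('rssagg', 1);
        req.onupgradeneeded = () => {
            // a single key/value object store for app settings
            req.result.createObjectStore('settings');
        };
        req.onsuccess = () => resolve(req.result);
        req.onerror = () => reject(req.error);
    });
}

async function getFeeds() {
    const db = await openDb();
    return new Promise((resolve, reject) => {
        const req = db.transaction('settings', 'readonly')
            .objectStore('settings').get('feeds');
        req.onsuccess = () => resolve(req.result || []);
        req.onerror = () => reject(req.error);
    });
}

async function saveFeeds(feeds) {
    const db = await openDb();
    return new Promise((resolve, reject) => {
        const tx = db.transaction('settings', 'readwrite');
        tx.objectStore('settings').put(feeds, 'feeds');
        tx.oncomplete = () => resolve();
        tx.onerror = () => reject(tx.error);
    });
}

Since everything else already awaits getFeeds, the rest of the app wouldn't need to change.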

Here's how I handle rendering the feeds in the dialog, as well as deletions:

async function renderFeedsDialog() {
    let feeds = await getFeeds();
    let content = '';
    feeds.forEach(f => {
        content += `<tr><td>${f}</td><td><button onclick="deleteFeed('${f}')">Delete</button></td></tr>`;
    })
    $feedTableDialog.innerHTML = content;
}

async function deleteFeed(feedUrl) {
    let feeds = await getFeeds();
    feeds = feeds.filter(f => f !== feedUrl);
    window.localStorage.setItem('feeds', JSON.stringify(feeds));
    await renderFeedsDialog();
}

Next up is the code that calls for feed items and handles rendering them:

async function loadFeedItems() {
    console.log('Loading feed items...');
    let feeds = await getFeeds();

    if(feeds.length === 0) {
        $content.innerHTML = '<p>No feeds available. Please add some feeds using the button below!</p>';
        return;
    }
    
    $content.innerHTML = '<i>Loading feed items...</i>';
    let qs = new URLSearchParams({
        feeds
    }).toString();

    let items = await fetch(`/loaditems?${qs}`, {
        method: 'GET',
        headers: {
            'Content-Type': 'application/json'
        }        
    });
    let data = await items.json();
    console.log('Fetched feed items', data.length);
    let result = '';
    data.forEach(item => {
        result += `<div class="feedItem">
            <h3><a href="${item.link}" target="_blank" rel="noopener">${item.title}</a></h3>
            <p>${snippet(item.content)}</p>
            <p><em>From: ${item.feedTitle} | Published: ${new Date(item.pubDate).toLocaleString()}</em></p>
        </div>`;
    });
    $content.innerHTML = result;
}

This is pretty vanilla network calling with fetch, although I'll note that I had some qualms about using the query string. There are limits to how long a query string can be, and in theory I should probably switch to a POST.
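If I ever do, the change is small on both sides. Here's a sketch (not what the app currently does) of sending the feed list in a JSON body instead:

// Client side: send the feeds in the request body rather than the query string
let items = await fetch('/loaditems', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ feeds })
});
let data = await items.json();

// Server side: add a POST export to the Astro route and read the body
export async function POST({ request }) {
    const { feeds } = await request.json();
    // ...same fetch/parse/cache logic as the GET version shown later
}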

The last bit of client-side code is snippet, which handles the content of the feed item:

/*
I clean the content from parsing to remove HTML and reduce size
*/
function snippet(content) {
    content = content.replace(/(<([^>]+)>)/gi, ""); // remove HTML tags
    const maxLength = 200;
    if (content.length <= maxLength) {
        return content;
    } else {
        return content.substring(0, maxLength) + '...';
    }
}

I should probably move that maxLength up top. I'll do so. Eventually.

Fetching RSS Items

Now for the fun part, fetching RSS items. For this, I made use of the rss-parser Node package. The logic is relatively simple: given a call to the route with a list of RSS URLs, fetch them, parse them, sort them by date, and return them. As an added wrinkle, I made use of Netlify Blobs for easy caching. Here's the server-side route I built, named loaditems.js:

import Parser from 'rss-parser';
import { getStore } from "@netlify/blobs";

const TTL = 1000 * 60 * 60; // 1 hour cache 

export async function GET({ request }) {

    const store = getStore('rssagg-store');
    const url = new URL(request.url);
    const feeds = url.searchParams.get('feeds').split(',');
    const parser = new Parser();
    console.log('Feeds requested:', feeds);
    let items = [];

    let reqs = [];
    for (const feedUrl of feeds) {
        // first, do we have this in cache?
        const cacheKey = `feedcache-${encodeURIComponent(feedUrl)}`;
        let cached = await store.get(cacheKey);
        if(cached) {
            cached = JSON.parse(cached);
            // check age
            const now = Date.now();
            if(now - cached.timestamp < TTL) {
                console.log(`Using cached feed for ${feedUrl}`);
                items.push(...cached.items);
            } else {
                console.log(`Cache expired for ${feedUrl}, fetching new data.`);
                reqs.push(parser.parseURL(feedUrl));
            }
        } else {
            console.log(`No cache for ${feedUrl}, fetching data.`);
            reqs.push(parser.parseURL(feedUrl));
        }

    }

    const results = await Promise.allSettled(reqs);
    for (const result of results) {
        if (result.status === 'fulfilled') {
            const feed = result.value;
            console.log(`Fetched feed: ${feed.title} with ${feed.items.length} items.`);
            let newItems = [];
            feed.items.forEach(item => {
                /*
                will use content as a grab all for different fields
                for example, netlify had summary, not content
                */
                let content = item.contentSnippet || item.summary || item.content || '';
                newItems.push({
                    title: item.title,
                    link: item.link,
                    content: content,
                    pubDate: item.pubDate,
                    feedTitle: feed.title
                });
            });

            items.push(...newItems);

            // cache the feed
            const cacheKey = `feedcache-${encodeURIComponent(feed.feedUrl)}`;
            console.log(`Caching feed data for ${feed.feedUrl}`);
            await store.setJSON(cacheKey, {
                timestamp: Date.now(),
                items: newItems
            });
        } else {
            console.error('Error fetching/parsing feed:', result.reason);
        }
    }

    // now sort items by pubDate descending
    items.sort((a, b) => new Date(b.pubDate) - new Date(a.pubDate));

    return new Response(JSON.stringify(items), {
        status: 200,
        headers: {
        "Content-Type": "application/json",
        },
    });
}

The cache is keyed on the RSS URL, so in theory, if two users come in requesting the same feed, they both get the benefit of the cache. I cache for one hour, which frankly could be a lot longer. I blog a lot and even I don't usually have more than 2 or 3 posts a week, so feel free to tweak this as you see fit if you play with the code.
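If you do fork it, the cache duration is an easy thing to make configurable. Here's one low-effort sketch; the CACHE_TTL_MINUTES environment variable name is my own invention, not something the project defines:

// Sketch: read the cache TTL from an environment variable, defaulting to one hour.
// CACHE_TTL_MINUTES is a made-up name for this example.
const ttlMinutes = Number(process.env.CACHE_TTL_MINUTES) || 60;
const TTL = 1000 * 60 * ttlMinutes;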

Check It Out!

Ok, if you want to try this out yourself, head over to https://astrorssagg.netlify.app/ and try adding a few feeds. Let me know how it works for you. The complete code of the application may be found here: https://github.com/cfjedimaster/astro-tests/tree/main/rssagg

But Wait...

Ok, that's all I have to say about the application, and in theory, you can stop reading now, but I really want to comment on something. The DX (developer experience) of using Astro on Netlify is incredible. Like, I was in shock at how things "just worked". Multiple times I was certain I was going to hit a roadblock, need to configure and tweak something, and honestly, that never happened. Here's what I found.

The first thing I did was add the Netlify adapter, which comes down to:

npx astro add netlify

By itself, that was enough, but I wanted my RSS-parsing route to run on the server, which meant I had to change one line in astro.config.js. By default, the defineConfig looks like this after adding the adapter:

export default defineConfig({
  adapter: netlify()
});

I added the output parameter:

export default defineConfig({
  output: 'server',
  adapter: netlify()
});

That was it, literally. I'm pretty sure I could have specified server output just for my one route, but this was the quick and dirty solution. I was sure I'd have to rewrite my code into a Netlify Function, but nope, it just worked.
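For the record, here's roughly what that per-route version could look like. This is a sketch I haven't tested in this project, and depending on your Astro version you may also need output: 'hybrid' instead of the default:

// Sketch (untested here): keep the default output and opt only this route
// out of prerendering, instead of setting output: 'server' globally.
// In src/pages/loaditems.js:
export const prerender = false;

export async function GET({ request }) {
    // ...same fetch/parse/cache logic as above
}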

Speaking of just working, blob support also just worked. Honestly I'm not even 100% sure I know how. In production, I know it automatically picks up on the right environment settings to associate the blobs with the site. Locally, I've got no clue. It's definitely a different store, not the production one, but again, it just plain worked.

I just want to give a huge shout out to the Astro team, and Netlify, for making this so freaking pleasant!
