Content Developer II at Microsoft, working remotely in PA, TechBash conference organizer, former Microsoft MVP, Husband, Dad and Geek.

.NET-Centric Uno Platform Debuts 'Single Project' for 9 Targets

"We've reduced the complexity of project files and eliminated the need for explicit NuGet package references, separate project libraries, or 'shared' projects."

Azure SDK Release (April 2024)


Thank you for your interest in the new Azure SDKs! We release new features, improvements, and bug fixes every month. Subscribe to our Azure SDK Blog RSS Feed to get notified when a new release is available.

You can find links to packages, code, and docs on our Azure SDK Releases page.

Give Feedback

If you’ve been using the Azure SDK, we’d love to hear your thoughts! Complete the survey!

Initial Stable Releases

  • Client library for .NET:
    • Dev Center
  • Client library for Go:
    • Cosmos DB
    • Event Grid
  • Client library for Java:
    • Azure XML
  • Client library for Python:
    • Azure Communication Messages
  • Management library for .NET:
    • Data Factory
    • Sphere
  • Management library for Go:
    • IoT Firmware Defense
    • Sphere
  • Management library for Java:
    • IoT Firmware Defense
    • Sphere
  • Management library for JavaScript:
    • IoT Firmware Defense
    • Sphere
  • Management library for Python:
    • IoT Firmware Defense
    • Sphere

Initial Beta Releases

  • Client library for .NET:
    • Microsoft Purview
  • Management library for .NET:
    • Device Registry
    • Migration Discovery SAP
    • Standby Pool
  • Management library for Go:
    • Azure Storage Actions
    • Migration Discovery SAP
    • Workloads SAP Virtual Instance
  • Management library for Java:
    • Azure Storage Actions
    • Workloads SAP Virtual Instance
    • Migration Discovery SAP
  • Management library for JavaScript:
    • Azure Storage Actions
    • Migration Discovery SAP
    • Workloads SAP Virtual Instance
  • Management library for Python:
    • Azure Storage Actions
    • Migration Discovery SAP
    • Workloads SAP Virtual Instance

Supportability

Python libraries stopped supporting Python 3.7 in December 2023. The next minor version, Python 3.8, is supported until April 2025. For more information, see the Azure SDK for Python version support policy.

Numerous older Azure SDK libraries, which weren’t compliant with the Azure SDK Guidelines, were retired (deprecated) in 2023. Our next deprecation milestone is March 2024. While no code stops working, support and updates end when a library reaches retirement. You can see a list of retired libraries along with their replacement libraries at Azure SDK Deprecated Releases.

Release notes

For a list of language-specific release notes, see the following links:

The post Azure SDK Release (April 2024) appeared first on Azure SDK Blog.


Introducing Augment: a company dedicated to empowering developers with AI


I’m incredibly excited to share that Augment, the company I joined to help empower developers, has come out of stealth.

With a lot of FUD around AI taking all of the knowledge worker jobs, including those of developers, I believe it is important to get across the counterargument:

“Don’t fire Kevin for Devin just yet. Augment Kevin with super powers!”

Me

If you think about what software engineers actually do and what AI excels at, you should reach the same conclusion. It’s easy to anthropomorphize AI tools, especially when you’re chatting with them and considering their portrayal in science fiction. With that in mind, I believe in creating systems that resemble J.A.R.V.I.S more than HAL.

As we develop these systems, it’s essential to remember that humans and computers have unique strengths. The real magic happens when humans take charge, supported by ever-present, fully connected computer systems.

By doing so, we can not only improve life for developers individually, but also empower teams and organizations to accomplish much more with reduced toil and communication costs.

I’m passionate about doing my part to help here, and I want to share my journey to Augment with you.

Seeing the future of software development

I love programming. Whenever I write some code, it tends to be a good day. There is something about the creative process that ends with something tangible that is good for my brain. Any platforms, tools, or services that allow me to stay in that certain flow of development become favorites. There is an art to taking an idea, breaking it down, and making progress.

The longer I am on the path to running code that works – or getting effective help back onto the path when it isn’t working – the better I feel.

On the flip side, the more I am doing something that feels like toil, or the more stuck I feel, the worse I feel.

There have been a couple of times when I saw how AI technology could dramatically help:

  • I worked with a research team inside X at Google that built models (in the pre-LLM/transformer days) to help highly skilled SWEs keep up with the constantly evolving monorepo. This was often very boring work, ripe for a computer to help with.
  • I worked on a project at Shopify that uses LLMs to bridge the complexity of GraphQL for developers wanting to integrate with merchant data. This quickly taught me lessons, such as:
    • It’s easy to show a cool (somewhat contrived) demo
    • It’s hard to build something great that works at scale in the real world
    • One LLM isn’t the answer for all use cases
    • It’s not just quantity… quality data matters
    • Having a system that can really do well wrt evaluations is vital as you iterate

Projects like these gave me the evidence to see how software engineering is going to radically change in the future, and pairing AI technology with developers will be the driver.

Meeting the Augment team

I was sold on the opportunity that this AI wave could allow us to help developers in new, expansive ways. I started to explore, and this exploration led me to chatting with a couple of old friends, Luke Wroblewski and Sam Pullara, who are building companies at Sutter Hill Ventures, a pretty unique VC firm.

Luke and Sam grinned as I spoke about my desire to build for developers with AI, and quickly introduced me to the founders and team behind Augment.

I met Guy Gur-Ari, the co-founder leading the research efforts at Augment. He had already assembled a team of AI researchers and engineers who had many years of expertise with ML and how it can be applied to code. This was important to me, as I had found that to build something truly great, you need the ability to make changes across the entire stack. You want to be able to change the engine along with the other parts of the car!

Igor Ostrovsky, the other co-founder and pioneer of Augment, also gave me a lot of faith that we had the broad technical expertise to pull this off at scale. His proven track record with distributed systems as Chief Architect of Pure Storage, his developer-focused work at Microsoft, and his deep dive into AI as an entrepreneur in residence with SHV were inspiring.

Then I discovered that Scott Dietzen had joined as CEO. I first met Scott at the birth of enterprise Java, when he was CTO at BEA WebLogic, my app server of choice.

As I met the broader team, I had a strong feeling that this was a team with the focus, experience, and skill to take a shot at building the best AI platform and ecosystem for developers.

The team had gone deep in building foundational technology that is needed to solve the meaty problems that developers have, especially at scale. These include building a system that:

Has an expert understanding of large codebases

There are solutions out there that feel like you have access to a system aware of core technology: they have a solid understanding of programming languages and popular frameworks.

When using Augment, we want you to feel like you are working with the joint intuition of your most seasoned engineers at the company, and those with deep expertise on the dependencies that you use.

Any suggestions need to reflect the APIs and coding patterns in your company’s code so your team can apply them to your actual day-to-day work.

Produces running code

The custom AI models and infrastructure are tuned for code and coding use cases, avoiding frustrating hallucinations and focusing on improving code quality… not just productivity.

Operates at the speed of thought

There were many search engines before Google, but I remember trying it for the first time and seeing how the experience was a step change. The quality of the results was next level AND the speed to return them felt different.

Working with LLMs can be a lil… slow, which massively degrades the experience and can keep knocking you out of flow.

The team had built fast inference — 3x faster than competitors — using state-of-the-art techniques, including custom GPU kernels, and I felt the difference in the experience.

Supports multiple developers & teams

Software development is a team sport. There are so many areas where technology can help scale and improve the use of best practices across a team, help you learn a complex codebase, and get new engineers onboarded faster.

The scale of computers allows a system to attend to much more, and they are available 24×7.

I have learned the power of small teams. We have seen with early customers that the shape of teams can change when you deliver the right capabilities. If we can enable smaller teams to do more, and teams to do more in parallel, the result is better software and happier devs to boot!

Includes strong IP protections

Your company’s source code is precious. Augment was designed from the first line of code for tenant isolation, with an architecture built to protect your IP.

Try Augment

Joining Augment has already been a blast. Moving at startup speed with a great crew all focused on helping developers is a dream come true for me. I feel very fortunate to have the opportunity to go after this problem space with a small (but growing! Join us?) team.

We are heads down delivering on our promise, working closely with early access customers, who have been a key part of our product development thanks to their fantastic feedback (thank you!).

We are furiously working our way to a public product launch that we can’t wait to share.

Until then, if you are interested in kicking the tires early, please sign up for the waitlist!


Vertical Slice Architecture Myths You Need To Know!


As with anything that gains traction from a wider community, some misconceptions develop and spread. Vertical slice architecture falls into that trap. Don’t worry. I’ve got you covered to demystify some common Vertical Slice Architecture myths and why they likely exist. Let’s set the record straight.

YouTube

Check out my YouTube channel, where I post all kinds of content accompanying my posts, including this video showing everything in this post.

Share Nothing

Easily the biggest Vertical Slice Architecture myth is that you cannot share anything between features. That’s always the question I get in the comments: “How do I share cross-cutting concerns?” or “But I’ll end up duplicating logic all over the place!”

I think this misconception exists because of the belief that features should be entirely independent of other features. That’s somewhat true: if you’re focusing on capabilities, some features will have nothing to do with others. However, you will also have features that relate to each other, which might require something shared.

One example of this is a shared underlying domain or data model. Another is workflow, where one feature kicks off another.

Also, I sometimes find people think of features as being too granular, which can add to this confusion. A feature can be a combination of commands and queries, but it can be as small as an individual command or query (think of a report).

For example, you might have two features that share the same validation while all of the features share the same underlying model. That could be a data model or more of a domain model.

There’s nothing wrong with this. Organizing by features focuses on your system’s capabilities (more on this later).

But the key point is that you don’t have a massive underlying model that spans the entire system. Instead, you have a model that’s focused and used specifically for the features that need it. What you share becomes very focused on the features.

One of the benefits of vertical slices is your ability to choose, per feature, implementation details that may differ from other features. For example, you may choose to use a domain model with an aggregate when you implement a command, while you may query the database directly for a query. Those two might use different libraries (dependencies) for each concern. That’s not to say they must, but it’s an option.
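
To make that concrete, here’s a minimal sketch of two slices for a hypothetical Shipment feature. The names, the repository, and the Dapper-based query are placeholders I’m using purely for illustration, not a prescription: the command works through a small aggregate, while the query skips the domain model and reads straight from the database.

```csharp
using System.Data;
using Dapper; // assumed here only for illustration; any direct data access works

// --- Command slice: goes through a small domain model (aggregate). ---
public sealed class Shipment
{
    public Guid Id { get; private set; }
    public Guid OrderId { get; private set; }
    public string Address { get; private set; } = "";

    public static Shipment Place(Guid orderId, string address)
    {
        if (string.IsNullOrWhiteSpace(address))
            throw new ArgumentException("Address is required.", nameof(address));
        return new Shipment { Id = Guid.NewGuid(), OrderId = orderId, Address = address };
    }
}

public interface IShipmentRepository
{
    Task Add(Shipment shipment, CancellationToken ct);
}

public sealed record PlaceShipment(Guid OrderId, string Address);

public sealed class PlaceShipmentHandler(IShipmentRepository repository)
{
    public async Task Handle(PlaceShipment command, CancellationToken ct)
    {
        // Behavior lives on the aggregate; the handler just orchestrates and persists.
        var shipment = Shipment.Place(command.OrderId, command.Address);
        await repository.Add(shipment, ct);
    }
}

// --- Query slice: no aggregate, no repository; it reads a row straight from the database. ---
public sealed record ShipmentSummary(Guid Id, string Address);

public sealed class GetShipmentSummaryHandler(IDbConnection connection)
{
    public async Task<ShipmentSummary?> Handle(Guid shipmentId) =>
        await connection.QuerySingleOrDefaultAsync<ShipmentSummary>(
            "SELECT Id, Address FROM Shipments WHERE Id = @shipmentId",
            new { shipmentId });
}
```

Both slices could just as easily share one approach; the point is that the choice is made per slice.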

Little to no abstraction

I think this myth exists because people think of Vertical Slices as the opposite of Layers, or as being at odds with Clean or Onion Architecture. But that’s not the case.

The point of vertical slices is about cohesion. Clean (or Onion) Architecture is about the direction of dependencies and how you manage coupling. They are orthogonal concerns and not mutually exclusive.

For example, you can still have a data access layer or some abstraction over data access. The question is: do you need that abstraction?

As an example, say you were using an ORM and it had hundreds of usages all over your application layer. If you want to replace that ORM with some other form of data access, changing all of those usages might be a massive undertaking, because your application layer is highly coupled to the ORM. If you hide the ORM behind some data access layer, it’s still the same problem: a high degree of coupling. Coupling is the issue here.

However, if a set of related features has only a dozen usages of that ORM, and you want to change it to something else or change the underlying database itself, managing that change for a dozen usages is far easier than for hundreds.

The point is that you’re segregating and deciding how you couple per feature rather than for an entire layer.

I think the misconception exists because once you start getting narrow in features, you realize the value of abstractions decreases: you have less coupling to certain dependencies.
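
As a rough sketch of what that narrower coupling can look like (the IInvoiceReads port, the EF Core adapter, and the billing types below are all made-up placeholders), a group of related slices can own a small contract shaped by exactly what those slices need, so swapping the ORM or the database only touches one adapter:

```csharp
using Microsoft.EntityFrameworkCore;

// A narrow port owned by the invoicing slices, shaped by what they actually need.
public interface IInvoiceReads
{
    Task<InvoiceDto?> GetById(Guid invoiceId, CancellationToken ct);
    Task<IReadOnlyList<InvoiceDto>> GetOverdue(DateOnly asOf, CancellationToken ct);
}

public sealed record InvoiceDto(Guid Id, decimal Total, DateOnly DueDate);

// Minimal EF Core plumbing so the sketch stands on its own.
public sealed class Invoice
{
    public Guid Id { get; set; }
    public decimal Total { get; set; }
    public DateOnly DueDate { get; set; }
}

public sealed class BillingDbContext(DbContextOptions<BillingDbContext> options) : DbContext(options)
{
    public DbSet<Invoice> Invoices => Set<Invoice>();
}

// Today the adapter uses EF Core; replacing it with Dapper or another database
// affects only the handful of invoice slices that depend on IInvoiceReads.
public sealed class EfInvoiceReads(BillingDbContext db) : IInvoiceReads
{
    public Task<InvoiceDto?> GetById(Guid invoiceId, CancellationToken ct) =>
        db.Invoices
          .Where(i => i.Id == invoiceId)
          .Select(i => new InvoiceDto(i.Id, i.Total, i.DueDate))
          .SingleOrDefaultAsync(ct);

    public async Task<IReadOnlyList<InvoiceDto>> GetOverdue(DateOnly asOf, CancellationToken ct) =>
        await db.Invoices
            .Where(i => i.DueDate < asOf)
            .Select(i => new InvoiceDto(i.Id, i.Total, i.DueDate))
            .ToListAsync(ct);
}
```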

Low barrier to entry

The myth here is that Vertical Slice Architecture is easier than something like Clean Architecture. I disagree, because, as I already mentioned, they aren’t mutually exclusive. It’s fundamentally about understanding coupling and cohesion and the trade-offs you have to make.

Your system’s architecture is composed of various architectural patterns, such as vertical slices, layered, event-driven, etc.

Ultimately, it’s about understanding what the problems are and applying the patterns that help solve those specific problems, while understanding the trade-offs you are making in doing so, often around coupling and cohesion.

On the surface, Vertical Slices might seem easier to understand. However, because you can make decisions and have more options per slice, you also take on trade-offs, such as inconsistent ways of implementing features. It’s a double-edged sword.

It’s not easier to understand because you can screw up anything you don’t truly understand.

Features don’t impact other features

This is another really popular Vertical Slice Architecture myth, and it goes back to the misconception of sharing nothing. You’re going to have some coupling between features. Oftentimes this is because of business processes or workflows where one feature ends, and that’s the starting point for another. Your workflows, while spanning different vertical slices, still have some degree of coupling; it just depends on how.

I do agree that you’re less likely to impact unrelated features, because you don’t end up sharing various cross-cutting concerns across all your features. As mentioned earlier in my ORM example, because you’re limiting the coupling to that dependency to specific features, only those features can be affected if you make a change. If you have one massive data access layer that’s coupled to everything, everywhere, and you make a change to that data access layer, you’re going to affect all kinds of usages. But since your features decide for themselves what they couple to, and shared concerns stay limited, you’re limiting the impact of change.

Requires Messaging for decoupling

And to piggyback on the last misconception: you’re going to have coupling between features, especially in workflows. But that doesn’t mean you need messaging or have to be using an event-driven architecture.

This misconception exists because using the publish-subscribe pattern and an event-driven architecture is a great way to remove temporal coupling. You still have coupling; you’ve just removed the temporal aspect.

A lot of illustrations of vertical slices show the usage of messaging or an event-driven architecture for this reason. But that doesn’t mean vertical slice architecture requires it.

Your architecture is composed of all kinds of different architectural patterns. If you need to remove temporal coupling and want to loosely couple between features or groups of features, then event-driven architecture can be a solution to that problem. But Vertical Slices do not require it. You could still have one feature make an in-memory call via some abstraction another feature exposes. It depends on whether that’s appropriate for your use case.
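
Here’s a minimal sketch of both options. The Ordering/Shipping names, IShippingRequests, and IMessagePublisher are all made up for illustration, and the publisher stands in for whatever messaging library you might actually use.

```csharp
// Option 1: the Shipping feature exposes a small contract; Ordering calls it in-process.
public interface IShippingRequests
{
    Task RequestShipment(Guid orderId, CancellationToken ct);
}

public sealed class PlaceOrderHandler(IShippingRequests shipping)
{
    public async Task Handle(Guid orderId, CancellationToken ct)
    {
        // ... persist the order ...
        await shipping.RequestShipment(orderId, ct); // direct call; temporal coupling remains
    }
}

// Option 2: Ordering publishes an event; Shipping reacts whenever its consumer processes it.
public sealed record OrderPlaced(Guid OrderId);

public interface IMessagePublisher
{
    Task Publish<T>(T message, CancellationToken ct);
}

public sealed class PlaceOrderHandlerWithMessaging(IMessagePublisher publisher)
{
    public async Task Handle(Guid orderId, CancellationToken ct)
    {
        // ... persist the order ...
        await publisher.Publish(new OrderPlaced(orderId), ct); // temporal coupling removed
    }
}
```

The coupling between the features exists either way; the second option just removes the requirement that Shipping be up and responding at the moment the order is placed.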

CRUD isn’t use-case driven

This isn’t a myth; I just wanted to add it because I think it’s the reverse. Vertical slice architecture is about focusing on your use cases. CRUD is the exact opposite of that. CRUD doesn’t capture intent. When you organize code around features, you’re explicitly capturing your system’s capabilities.

If you’re simply doing Create-Read-Update-Delete and generating an API and UI around a database, do you really need Vertical Slices? Not really. The value to me is capturing the capabilities, workflows, and business processes.
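
A quick, made-up contrast to illustrate the difference in intent:

```csharp
// CRUD-style: one catch-all update; the reason the data changed is lost.
public sealed record UpdateCustomer(Guid Id, string Name, string ShippingAddress, bool IsClosed);

// Use-case driven: each command names a capability the business actually has.
public sealed record ChangeCustomerShippingAddress(Guid CustomerId, string NewAddress);
public sealed record CloseCustomerAccount(Guid CustomerId, string Reason);
```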

Join CodeOpinion!
Developer-level members of my Patreon or YouTube channel get access to a private Discord server to chat with other developers about Software Architecture and Design and access to source code for any working demo application I post on my blog or YouTube. Check out my Patreon or YouTube Membership for more info.

The post Vertical Slice Architecture Myths You Need To Know! appeared first on CodeOpinion.


Apple’s new AI model hints at how AI could come to the iPhone


Apple has been quiet about its plans for generative AI, but with the release of new AI models today, it appears the company’s immediate ambitions lie firmly in the “make AI run locally on Apple devices” realm.

Researchers from Apple released OpenELM, a series of four very small language models on the Hugging Face model library, on Wednesday. Apple said on its Hugging Face model page that OpenELM, which stands for “Open-source Efficient Language Models,” performs very efficiently on text-related tasks like email writing. The models are open source and ready for developers to use.

It has four sizes: 270 million parameters; 450 million parameters; 1.1 billion parameters; and 3 billion...



Leading in the era of AI: How Microsoft’s platform differentiation and Copilot empowerment are driving AI Transformation


A Copilot on every desk, every device and across every role is core to Microsoft’s mission to empower every person and every organization on the planet to achieve more. By bolstering the services customers know and love with our Copilot capabilities across the Microsoft Cloud, we are enabling zero-shot innovation — the ability to effectively gain value out of the box — for their businesses to increase productivity, creativity and inclusive collaboration. ISVs, digital natives, startups and the rest of our partner ecosystem are leveraging our Copilot stack to build AI solutions that are reshaping business processes across industries. We continue co-innovating directly with customers on our open cloud platform to bend the curve on innovation by identifying AI design patterns; then integrating and applying data to solve their most pressing challenges faster and more efficiently than ever before. Underpinning it all is the need for a strong cybersecurity foundation, and the work we are doing to ensure this critical imperative is met. No one is immune from bad actors, and we remain committed to transparency and trust in our approach to protecting customer and partner data. By empowering organizations with industry-leading cloud and responsible AI solutions, paired with our focus on security, we are helping them unlock opportunities that deliver pragmatic business outcomes. I am proud of the work we have done this past quarter that exemplifies our approach to enabling AI Transformation, and the success of our customers and partners across industries who have embraced it.

Just this week we announced our strategic partnership with The Coca-Cola Company to accelerate AI Transformation enterprise-wide as well as across its global network of independent bottlers, and shared news of our partnership with Cognizant to drive enterprise AI adoption for millions of users. We are expanding our work with G42 to accelerate responsible AI innovation in the United Arab Emirates and beyond while accelerating digital transformation securely across the Middle East, Central Asia and Africa with expanded access to services and technologies. We are also building on our relationship with Cloud Software Group to bring joint cloud solutions and generative AI capabilities to more than 100 million people.

We continue to shape the future of industry alongside our customers to help them differentiate their businesses. At CES, we showcased how our customers and partners are innovating across the automotive and mobility industry with generative AI to solve deep business problems and create new opportunities, and revealed the Copilot key for Windows 11 PCs. At NRF, we shared new copilot templates to help retailers incorporate generative AI across the shopper journey while enhancing the experience for store associates and making AI implementation more accessible. Just recently at HIMSS, we announced Microsoft’s role as the technology enabling partner for the Trustworthy & Responsible AI Network, a consortium of healthcare leaders aimed at operationalizing responsible AI principles to improve the quality, safety and trustworthiness of AI in healthcare.

Air India harnesses AI for operational excellence with Copilot for Microsoft 365.

Enabling pragmatic AI innovation with Copilot capabilities to deliver immediate value

Companies like Amadeus, AAMI, TotalEnergies and Cushman & Wakefield are leveraging Copilot for Microsoft 365 to enrich customer interactions and advance high-priority projects. Visa employees are transforming how they work to better serve the needs of their clients and Banca Transilvania is boosting efficiency and innovation while delivering the highest quality of customer service. Emirates NBD engineers are solving their most complex problems while remaining within their development environments, and 96% of early adopters at CommBank have shared that Copilot is making them more productive. Colombia-based Cenit is experiencing rapid adoption of Copilot to empower employees to be more creative, productive and collaborative. U.K.-based law firm Clifford Chance is helping employees automate daily tasks like meeting notes, email drafts and inbox management with AI and natural language processing, while Air India built a plugin to gain real-time access to flight and operations data to empower better decision-making.

With Copilot for Dynamics 365 Customer Insights, Northrop & Johnson achieved a 250% increase in charter bookings while helping team members save time, boost productivity and enrich the quality of engagement with their customers. Using Power Platform and Microsoft Copilot Studio, Cineplex is saving employees over 30,000 hours annually in manual processing times and reducing ticket refund handling time from 15 minutes to under a minute. Northern Trust is using Microsoft Copilot for Service to modernize its client relations organization, streamline employee processes and elevate the client experience. Teams at EPAM Systems are leveraging Microsoft Copilot for Sales to improve sales processes and better support its business strategies, while freeing up time to spend with customers.

Blue Yonder optimizes supply chain orchestration with Azure OpenAI Service.

Building AI design patterns that bend the curve on innovation and intelligently reason over data to solve business challenges

Icertis is leveraging Azure OpenAI Service and its proprietary AI models in conjunction with its extensive data lake to uncover cost savings, enhance compliance and reduce risk across millions of legal contracts. Insurance company FWD Group is using the service to enhance customer experiences and operations across its lines of business, and Singapore-based CapitaLand Investment has saved 1 million Singapore dollars and 10,000 work hours per year with data-driven AI models. With Azure OpenAI Service as the secure foundation for its supply chain platform, Blue Yonder is harnessing AI and machine learning to provide real-time decision making for businesses across 78 countries. Homes & Villas by Marriott Bonvoy and Publicis Sapient are making it easier for travelers to find the right vacation homes through natural language search powered by large language models within Azure OpenAI Service, and Miral is curating some of the most sought-after leisure, entertainment and tourism activities with a 24/7 AI-powered concierge. Using several AI models, Australian retailer Coles developed an edge computing platform that makes 1.6 billion informed predictions each day so customers can find exactly what they are looking for across its 850 stores.

With Microsoft Fabric, Melbourne Airport is synchronizing flight bookings and ground transportation data to project demand while maintaining efficiency, reliability and safety of its operations. Rail freight operator Aurizon is deriving better data insights from 400 sensor-equipped locomotives to enhance cost efficiency, scalability and predictive maintenance. Seair Exim Solutions is increasing export-import trade data ingestion speeds by 90% to empower its global shipping industry customers with improved insights more quickly with help from Mandelbulb Technologies, while Dener Motorsport is using Fabric and real-time analytics to help automotive engineers detect and resolve car issues within minutes — down from nearly half an hour. By hosting its data within Azure Kubernetes, Windstream is improving its custom GPT platform to help employees find the information they need faster across 100,000 indexed documents. To promote financial inclusion, Trusting Social focused on a strong data foundation paired with AI services to enable the development and deployment of solutions that assist more than 130 financial institutions across Asia. Sasfin Bank worked with Legal Interact to implement a unified document management system to analyze contract clauses and streamline legal operations and workflows using Azure Cognitive Services and Azure SQL. TomTom has deployed GitHub Copilot to its developers, resulting in 85% of the company’s engineers feeling more productive and 70% saying it enables them to focus on more satisfying work. Meesho — a leading online marketplace in India — is using a generative AI chat agent to increase inquiry resolution by 40% and customer satisfaction by 25%, while leveraging GitHub Copilot to streamline code delivery and testing for its 200+ developers.

Grupo Bimbo bakes in end-to-end data security and compliance with Microsoft Purview.

Strengthening cybersecurity foundations with industry-leading cloud and responsible AI solutions

Microsoft’s Secure Future Initiative (SFI) reflects our commitment to continually advancing the built-in security of our products and services. We recently announced the general availability of Microsoft Copilot for Security — the industry’s first generative AI solution designed to help security and IT professionals do their work with greater speed and accuracy. In our latest Copilot for Security economic study, security analysts using the Copilot were 22% faster and 7% more accurate across all tasks, with 97% saying they want to use it again. Our ability to offer this solution to customers is bolstered by more than 100 partners such as Capgemini, Cisco and Darktrace who are committed to working with us to deliver trusted, responsible AI solutions. In collaboration with TC1 Labs, Pacifico Seguros has become the first company in Latin America to implement Copilot for Security with a Zero Trust approach, strengthening its security defenses and confidently preparing the company for future security challenges. Security professionals at Enverus are saving hours per day on low-value tasks with Copilot, allowing them to focus on strategic work and help elevate the impact of their security team. Wipro is using the solution to help employees save time and act quickly to improve outcomes across its business.

Using the Microsoft Defender security stack, Del Monte Foods increased security visibility and coverage across its IT environment, reducing security risks by 50% and improving its security response and remediation capabilities. To streamline its security operations, Jones Lang LaSalle implemented Microsoft Defender for Endpoint with help from Tanium, reducing cybersecurity spending by 20% and enhancing protection for nearly 90,000 endpoints. Türkiye-based Demirören has adopted a Zero Trust security model by unifying its security solutions to protect essential business data and enhance threat detection and response capabilities. Paytronix, a provider of customized digital experiences for restaurant and convenience store guests, improved its security profile by seamlessly migrating data for 250 million user accounts to Azure without interruption. The Audi Group’s IT department is simplifying endpoint management and safeguarding its hybrid workforce across 12 countries with enhanced security using Microsoft Intune. To ensure robust data protection and seamless access for authorized personnel, SLT-MOBITEL is using Microsoft’s AI-powered security products to fortify end-user devices and services while simplifying information security and compliance. Oregon State University and Grupo Bimbo are utilizing Microsoft data security solutions to safeguard their environments and confidently prepare for their continued adoption and deployment of AI.

Embracing AI Transformation has become a priority for organizations around the world seeking to unlock AI opportunities. Whether helping our customers adopt and scale AI with our Copilot capabilities or building unique AI strategies leveraging our open platform and partner ecosystem, we are humbled to work with you to drive pragmatic business outcomes and differentiated industry leadership. Our commitment to your success is unwavering. We will continue building upon our trusted relationships and our focus on responsibility and security so you can move forward with confidence in your AI journey. I remain inspired by what we have done — and what we will do — to help our customers and partners achieve more.

The post Leading in the era of AI: How Microsoft’s platform differentiation and Copilot empowerment are driving AI Transformation appeared first on The Official Microsoft Blog.
