
Visual Studio App Center Data’s Completely Revamped Portal Experience


Over the past few weeks, the Visual Studio App Center team has been hard at work adding new functionality to make managing and syncing your data much easier. Recently, we shipped an entirely new portal experience for App Center Data, which I’ll detail in this post.

If you’re unfamiliar with App Center Data, a part of our new Mobile-Backend-as-a-Service (MBaaS) offering, it enables you to easily scale, manage, and sync your data in the cloud for both online and “offline-first” apps. We’ve partnered with Cosmos DB, Microsoft’s globally distributed multi-model database, to give you the ability to create a scalable and reliable backend for your app in a few quick steps. If you’re looking to learn more about App Center Data and how it works, see our previous post Visual Studio App Center: Changing the Way you Handle App Data.

Within the App Center portal, App Center Data provides an easy way to provision and connect existing Cosmos DB databases, which simplifies the process of getting up and running quickly. Over the past few weeks, the App Center team has made two HUGE additions to the portal, which put even more power in your hands. These updates are the Metrics View and the Data Explorer.

Metrics View


With our all new Metrics view, you can home in on exactly how your backend is doing. Various metrics associated with your app are now easily viewable directly from the App Center portal. In addition to putting several powerful metrics at your fingertips, this page also provides insight into your throughput, which you can use to get an accurate reading on your billing.

Data Explorer


The App Center Data Explorer allows you to engage directly with and manage your data in the App Center portal. You have complete control over your data: you can create, read, update, and delete documents directly in App Center. Previously, this was only available through the App Center Data client SDKs and the Azure portal. You can also easily see the distinction between your public and private documents for easy navigation, and you can filter your data to key in on the document or documents of your choice. The possibilities are endless in our new Data Explorer.

These updates are live today so go to the App Center website to get started!

 

The post Visual Studio App Center Data’s Completely Revamped Portal Experience appeared first on App Center Blog.


Node.js 12 to LTS and Node.js 13 is here!


This blog was written by Michael Dawson, with additional contributions from the Node.js Community Committee and the Node.js Technical Steering Committee.

We are excited to announce that Node.js 12 was promoted to Long Term Support (LTS) yesterday and that Node.js 13 was released today.

These releases deliver faster startup and better default heap limits; updates to V8, TLS, and llhttp; and new features including a diagnostic report, a bundled heap dump capability, updates to Worker Threads and N-API, and more.

Node.js 12 becomes the newest LTS release, joining 10 and 8. Please note that 8 goes end-of-life in December (an exception to the regular LTS cycle due to the EOL of OpenSSL 1.0.2), so you should already be planning your migration off it to either 10 or 12. For a reminder of the new features in Node.js 12, please check out https://medium.com/@nodejs/introducing-node-js-12-76c41a1b3f3f

The Node.js 13 release replaces version 12 in our ‘current’ release line. This release won’t be promoted to LTS, so we don’t recommend it for production use, but it is still useful to build and test with periodically. It lets you try out the latest features and ensure that your packages and applications will be compatible with future versions.

V8 Gets an Upgrade to V8 7.8

As always, a new version of the V8 JavaScript engine brings performance tweaks and improvements, as well as keeping Node.js up to date with the ongoing improvements in the language and runtime. Read more about V8 at their official blog.

Full ICU enabled by default

Internationalization via the International Components for Unicode (ICU) has been supported by Node.js for a long time. It helps developers write code that can support users in multiple languages and locales. Up until Node.js 13, however, only English was enabled by default. This meant that for many deployments, additional steps were required to get and enable support for the target locales. See https://nodejs.org/api/intl.html#intl_providing_icu_data_at_runtime for more details. As of Node.js 13, full-icu is now the default, meaning that hundreds of locales are supported out of the box. This should simplify development and deployment of applications for non-English deployments.
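As an illustration, locale-sensitive APIs such as Intl.DateTimeFormat now produce localized output with no extra ICU data files; the date and locales below are arbitrary examples:

```javascript
// With full-icu as the default, non-English locales work out of the box.
// Under small-icu builds (the previous default), these would fall back to English.
const date = new Date(Date.UTC(2019, 9, 22)); // 22 October 2019

const spanish = new Intl.DateTimeFormat('es-ES', { month: 'long', timeZone: 'UTC' });
const german = new Intl.DateTimeFormat('de-DE', { month: 'long', timeZone: 'UTC' });

console.log(spanish.format(date)); // "octubre"
console.log(german.format(date));  // "Oktober"
```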

Stable Workers API

While making the Workers API stable was backported to Node.js 12, we still feel it is important to call this out as a significant milestone in the Node.js 13 timeframe. Worker Threads (https://nodejs.org/api/worker_threads.html) is now a stable feature in both Node.js 12 and 13. While Node.js already performs well with its single-threaded event loop, there are some use cases where additional threads can be leveraged for better results. We’d like you to try them out and let us know what use cases you have where they are helpful. For a quick introduction, check out this great article: https://medium.com/@Trott/using-worker-threads-in-node-js-80494136dbb6.

New compiler and platform minimums

Node.js and V8 continue to embrace newer C++ features and take advantage of newer compiler optimizations and security enhancements.

With the release of Node.js 13, the codebase now requires a minimum of Xcode 10 for the macOS development tools and version 7.2 of the AIX operating system.

In addition, progress has been made on supporting Python 3 for building Node.js. Systems that have both Python 2 and Python 3 installed will still use Python 2; however, systems with only Python 3 should now be able to build using Python 3. If you encounter any problems, please let us know.

Further details are available in the Node.js build documentation: https://github.com/nodejs/node/blob/master/BUILDING.md#platform-list

Thank you!
A big thank you to everyone who made this release come together, whether you submitted a pull request, helped with our benchmarking efforts, or you were in charge of one of the release versions. We’d also like to thank the Node.js Build Working Group for ensuring we have the infrastructure to create and test releases.

The release manager for Node.js 13 is Bethany Griggs. The release manager for promoting Node.js 12 to LTS was Michaël Zasso. For a full list of the release team members head here. You can read more about the complete list of features here.

If you are interested in contributing to Node.js, we welcome you. Learn more via our contributor guidelines.


Electron 7.0.0


Electron 7.0.0 has been released! It includes upgrades to Chromium 78, V8 7.8, and Node.js 12.8.1. We've added a Windows on Arm 64 release, faster IPC methods, a new nativeTheme API, and much more!


The Electron team is excited to announce the release of Electron 7.0.0! You can install it with npm via npm install electron@latest or download it from our releases website. The release is packed with upgrades, fixes, and new features. We can't wait to see what you build with them! Continue reading for details about this release, and please share any feedback you have!

Notable Changes

  • Stack Upgrades:

    Stack    | Electron 7  | What's New                     | Electron 6
    Chromium | 78.0.3905.1 | 78, 77                         | 76.0.3809.146
    V8       | 7.8         | 7.8, 7.7                       | 7.6
    Node.js  | 12.8.1      | 12.8.1, 12.8, 12.7, 12.6, 12.5 | 12.4.0
  • Added Windows on Arm (64 bit) release. #18591, #20112

  • Added ipcRenderer.invoke() and ipcMain.handle() for asynchronous request/response-style IPC. These are strongly recommended over the remote module. See this "Electron’s ‘remote’ module considered harmful" blog post for more information. #18449

  • Added nativeTheme API to read and respond to changes in the OS's theme and color scheme. #19758, #20486

  • Switched to a new TypeScript Definitions generator. The resulting definitions are more precise, so if your TypeScript build fails, this is the likely cause. #18103
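Two of the additions above, the invoke/handle IPC pair and the nativeTheme API, can be sketched as follows. This is Electron-only code, and the channel name is an arbitrary example rather than anything from the release notes:

```javascript
// main.js (main process)
const { app, ipcMain, nativeTheme } = require('electron');

app.whenReady().then(() => {
  // Request/response-style IPC: the renderer invokes, the main process handles.
  ipcMain.handle('get-theme-info', async () => {
    return { darkMode: nativeTheme.shouldUseDarkColors };
  });

  // React to OS theme or color scheme changes.
  nativeTheme.on('updated', () => {
    console.log('OS theme changed; dark mode:', nativeTheme.shouldUseDarkColors);
  });
});

// renderer.js (renderer process)
const { ipcRenderer } = require('electron');

async function showTheme() {
  // invoke() returns a Promise that resolves with the handler's return value.
  const { darkMode } = await ipcRenderer.invoke('get-theme-info');
  console.log('dark mode?', darkMode);
}
```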

See the 7.0.0 release notes for a longer list of changes.

Breaking Changes

More information about these and future changes can be found on the Planned Breaking Changes page.

  • Removed deprecated APIs:
    • Callback-based versions of functions that now use Promises. #17907
    • Tray.setHighlightMode() (macOS). #18981
    • app.enableMixedSandbox(). #17894
    • app.getApplicationMenu(), app.setApplicationMenu(), powerMonitor.querySystemIdleState(), powerMonitor.querySystemIdleTime(), webFrame.setIsolatedWorldContentSecurityPolicy(), webFrame.setIsolatedWorldHumanReadableName(), and webFrame.setIsolatedWorldSecurityOrigin(). #18159
  • Session.clearAuthCache() no longer allows filtering the cleared cache entries. #17970
  • Native interfaces on macOS (menus, dialogs, etc.) now automatically match the dark mode setting on the user's machine. #19226
  • Updated the electron module to use @electron/get. The minimum supported node version is now Node 8. #18413
  • The file electron.asar no longer exists. Any packaging scripts that depend on its existence should be updated. #18577

End of Support for 4.x.y

Electron 4.x.y has reached end-of-support as per the project's support policy. Developers and applications are encouraged to upgrade to a newer version of Electron.

App Feedback Program

We continue to use our App Feedback Program for testing. Projects that participate in this program test Electron betas on their apps, and in return, the new bugs they find are prioritized for the stable release. If you'd like to participate or learn more, check out our blog post about the program.

What's Next

In the short term, you can expect the team to continue to focus on keeping up with the development of the major components that make up Electron, including Chromium, Node, and V8. Although we are careful not to make promises about release dates, our plan is to release new major versions of Electron with new versions of those components approximately quarterly. The tentative 8.0.0 schedule maps out key dates in the Electron 8 development life cycle. Also, see our versioning document for more detailed information about versioning in Electron.

For information on planned breaking changes in upcoming versions of Electron, see our Planned Breaking Changes doc.


Write your own Desired State Configuration (DSC) module



As you gain experience with PowerShell Desired State Configuration (DSC), you might encounter situations where the available modules don’t quite fit what you want to do. You could write your own Script Resources, but they don’t scale well, passing parameters is difficult, and they don’t provide a method for encryption, leaving passwords in clear text. Instead, you can write your own DSC modules.

A tool to help you write your module

Writing your own module isn’t really that hard. The most difficult part is getting the files and folders in the correct locations because DSC is quite specific about what goes where. However, Microsoft recognizes this can be quite frustrating and has developed xDscResourceDesigner, a PowerShell module to help you get started. Using this module, you can easily define what properties your resource needs, and it will generate the entire module structure for you, including the MOF schema file. If you’re a first-timer, I highly recommend using this module; it could save you quite a bit of frustration (take it from me).

Installing xDscResourceDesigner

Installing the module is no different from installing any other module onto your system:

Install-Module -Name xDscResourceDesigner

As this Microsoft article points out, if you have a version of PowerShell prior to version 5, you may need to install the PowerShellGet module for installation to work.

Using xDscResourceDesigner

Using xDscResourceDesigner is actually pretty easy; there are only two functions: New-xDscResourceProperty and New-xDscResource. New-xDscResourceProperty is what you use to define the properties of your DSC resource. After you’ve done that, you send that information to the New-xDscResource function, and it generates everything you need to implement your resource:

# Import the module for use
Import-Module -Name xDscResourceDesigner

# Define properties
$property1 = New-xDscResourceProperty -Name Property1 -Type String -Attribute Key
$property2 = New-xDscResourceProperty -Name Property2 -Type PSCredential -Attribute Write
$property3 = New-xDscResourceProperty -Name Property3 -Type String -Attribute Required -ValidateSet "Present", "Absent"

# Create my DSC Resource
New-xDscResource -Name DemoResource1 -Property $property1, $property2, $property3 -Path 'c:\Program Files\WindowsPowerShell\Modules' -ModuleName DemoModule

And there you have it, your very own DSC module with all the stubs generated.

Understanding the resource Attribute property

For the Attribute component of a Resource Property within DSC, there are four possible values:

  • Key
  • Read
  • Required
  • Write

Key

Every node that uses your resource must have a key that makes the node unique. Similar to a database table key, this key doesn’t need to be a single property; it can be made up of several properties, each bearing the Key attribute. In the example above, Property1 is our Key for the resource. However, it could also be done this way:

# Define properties
$property1 = New-xDscResourceProperty -Name Property1 -Type String -Attribute Key
$property2 = New-xDscResourceProperty -Name Property2 -Type PSCredential -Attribute Write
$property3 = New-xDscResourceProperty -Name Property3 -Type String -Attribute Required -ValidateSet "Present", "Absent"
$property4 = New-xDscResourceProperty -Name Property4 -Type String -Attribute Key
$property5 = New-xDscResourceProperty -Name Property5 -Type String -Attribute Key

In this example, Property1, Property4, and Property5 together make up the unique value for the node. Key attributes are always writable and are required.

Read

Read attributes are read-only and cannot have values assigned to them.

Required

Required attributes are assignable properties that must be specified when declaring the configuration. In our example above, Property3 is set to be required.

Write

Write attributes are optional attributes that you can assign a value to when defining the node. In the example, Property2 is defined as a Write attribute.

ValidateSet switch

The ValidateSet switch can be used with Key or Write attributes to specify the allowable values for a given property. In our example, we’ve specified that Property3 can only be Present or Absent; any other value will result in an error.
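To see these attributes in action, here is a sketch of how a DSC configuration might consume the generated resource; the configuration and node names are placeholders, not part of the generated module:

```powershell
Configuration DemoConfiguration
{
    # Make the generated resource available to this configuration.
    Import-DscResource -ModuleName DemoModule

    Node 'localhost'
    {
        DemoResource1 Example
        {
            Property1 = 'MyUniqueKey'   # Key: uniquely identifies this resource instance
            Property3 = 'Present'       # Required: must be 'Present' or 'Absent' (ValidateSet)
            # Property2 (Write) is optional and omitted here
        }
    }
}
```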

DSC module file and folder structure

Whether you decide to do it yourself or use the tool, the folder and file structure will look like the following:

$env:ProgramFiles\WindowsPowerShell\Modules (folder)
    |- DemoModule (folder)
        |- DSCResources (folder)
            |- DemoResource1 (folder)
                |- DemoResource1.psd1 (file, optional)
                |- DemoResource1.psm1 (file, required)
                |- DemoResource1.schema.mof (file, required)

The MOF file

MOF stands for Managed Object Format and is the language used to describe Common Information Model (CIM) classes. Using the example from the section above, the resulting MOF file will look like this:

[ClassVersion("1.0.0.0"), FriendlyName("DemoResource1")]
class DemoResource1 : OMI_BaseResource
{
    [Key] String Property1;
    [Write, EmbeddedInstance("MSFT_Credential")] String Property2;
    [Required, ValueMap{"Present","Absent"}, Values{"Present","Absent"}] String Property3;
};

The MOF file will only contain the properties we will use in our module, along with their attributes and data types. Unless we’re adding or removing properties, this is pretty much all we do with the MOF file.

The psm1 file

The psm1 file is where the bulk of our code is going to be. This file will contain three required functions:

  • Get-TargetResource
  • Test-TargetResource
  • Set-TargetResource

Get-TargetResource

The Get-TargetResource function returns the current value(s) of what the resource is responsible for. Our stubbed function from using xDscResourceDesigner looks like the following:

function Get-TargetResource
{
    [CmdletBinding()]
    [OutputType([System.Collections.Hashtable])]
    param
    (
        [parameter(Mandatory = $true)]
        [System.String]
        $Property1,

        [parameter(Mandatory = $true)]
        [ValidateSet("Present","Absent")]
        [System.String]
        $Property3
    )

    #Write-Verbose "Use this cmdlet to deliver information about command processing."

    #Write-Debug "Use this cmdlet to write debug information while troubleshooting."


    <#
    $returnValue = @{
    Property1 = [System.String]
    Property2 = [System.Management.Automation.PSCredential]
    Property3 = [System.String]
    }

    $returnValue
    #>
}

Note that the optional parameter (Write attribute) Property2 is not required for this function.

Test-TargetResource

The Test-TargetResource function returns a boolean value indicating whether or not the resource is in the desired state. From our generated example, the function looks like this:

function Test-TargetResource
{
    [CmdletBinding()]
    [OutputType([System.Boolean])]
    param
    (
        [parameter(Mandatory = $true)]
        [System.String]
        $Property1,

        [System.Management.Automation.PSCredential]
        $Property2,

        [parameter(Mandatory = $true)]
        [ValidateSet("Present","Absent")]
        [System.String]
        $Property3
    )

    #Write-Verbose "Use this cmdlet to deliver information about command processing."

    #Write-Debug "Use this cmdlet to write debug information while troubleshooting."


    <#
    $result = [System.Boolean]

    $result
    #>
}
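To make this concrete, here is a hedged sketch of what a filled-in Test-TargetResource might look like for a hypothetical resource that treats Property1 as a file path and Property3 as whether the file should exist. This logic is illustrative only and is not part of the generated stub:

```powershell
function Test-TargetResource
{
    [CmdletBinding()]
    [OutputType([System.Boolean])]
    param
    (
        [parameter(Mandatory = $true)]
        [System.String]
        $Property1,

        [System.Management.Automation.PSCredential]
        $Property2,

        [parameter(Mandatory = $true)]
        [ValidateSet("Present","Absent")]
        [System.String]
        $Property3
    )

    # Hypothetical logic: Property1 is a file path, Property3 says whether it should exist.
    $exists = Test-Path -Path $Property1

    if ($Property3 -eq 'Present')
    {
        return $exists
    }

    return (-not $exists)
}
```

Whatever the resource manages, the shape is the same: inspect the current state and return $true only when it already matches the desired state, so the Local Configuration Manager knows whether to call Set-TargetResource.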

Set-TargetResource

The Set-TargetResource function is used to configure the resource to the specified desired state. Our generated example looks like this:

function Set-TargetResource
{
    [CmdletBinding()]
    param
    (
        [parameter(Mandatory = $true)]
        [System.String]
        $Property1,

        [System.Management.Automation.PSCredential]
        $Property2,

        [parameter(Mandatory = $true)]
        [ValidateSet("Present","Absent")]
        [System.String]
        $Property3
    )

    #Write-Verbose "Use this cmdlet to deliver information about command processing."

    #Write-Debug "Use this cmdlet to write debug information while troubleshooting."

    #Include this line if the resource requires a system reboot.
    #$global:DSCMachineStatus = 1
}

Summary

Whether your resource is simple or complex, the steps for creating your own DSC module are the same. This post is aimed at getting you started in the right direction. From here, you can create your module to fit whatever resource you need to configure and keep in a desired state. For a full example of a working module, check out xCertificatePermission on my GitHub repo.


Pulumi, with Joe Duffy


Joe Duffy is the founder and CEO of Pulumi, an open-source cloud development platform. He joins Adam and Craig to explain why a general purpose programming language is a better tool for cloud infrastructure than a domain-specific language (or YAML), and how you can use Pulumi to provision cloud infrastructure and Kubernetes resources alike.

Do you have something cool to share? Some questions? Let us know!

Chatter of the week

News of the week





Download audio: https://kubernetespodcast.com/episodes/KPfGep076.mp3

STEM Everyday: #142 | Mr. Rogers of Technology | feat. Mista Pat (Pat Person)


As a young kid growing up in South Central Los Angeles, Mista Pat was fascinated with science and technology. As an adult, he was inspired by his 4-year-old son, Logan, to co-found a company called Kids That Code, Inc. in San Bernardino, California, that teaches young children computer programming and other STEAM (Science, Technology, Engineering, Arts, Math) related subjects. He realized that underrepresented and underprivileged children were fascinated with technology but not participating in the technology revolution. Mista Pat says “they can’t be what they can’t see!” To that end, he is dedicated to inspiring young children to one day become the amazing people who will create the companies, products, and services that change our world.




Connect with Mista Pat



  • Follow him on Twitter
  • Check out mistapat.org
  • Learn more about Kids That Code





Chris Woods is your host for the STEM Everyday Podcast – HS math teacher, STEM presenter & podcaster, and iBook creator. “STEM is everywhere… Let me help you see it & add it to your classroom.”



  • Follow Chris, DM him, just say “Hey.”
  • Check out Chris’s Website
  • Follow Chris on Twitter
  • Follow Chris on Instagram
  • Subscribe to Chris on Youtube




Do you receive the Daily STEM? It’s a FREE one-page weekly “newspaper” that helps inspire kids to engage with STEM in the real world.



Head on over and don’t miss another newsletter: http://dailystem.com/news/




Download audio: http://traffic.libsyn.com/remarkablechatter/STEM_Everyday_142_Pat_Person.mp3