As your team works together, you’ll undoubtedly have files that you’ll want to share and collaborate on. Teams makes it easy to share files and work on them together. If you’re working in Word, Excel, PowerPoint, or Visio files, your colleagues can even view, edit, and collaborate on them right within Teams.
Your files library
Within each team there are channels. Think of these channels as places for your entire team to discuss a specific topic, like upcoming training or project updates. Each channel has its own file folder where you can share files for that specific channel. To access that folder, go to the channel and select the Files tab above the conversation window.
Note: Make sure to use the Files tab at the top of the channel conversation window instead of the Files button on the left side of the app. That button gives you access to ALL of your cloud files, rather than just the files for a specific channel.
In the library you can upload existing files or create new ones. When you upload a file, it creates a copy in Teams.
Upload existing files
There are two primary ways to upload existing files into your library. Both methods start by going to the files folder for your channel and both methods, by default, upload copies of your files to the channel file folder.
Drag and drop – Using your mouse, drag the file from where it’s currently located and drop it on the Teams window among the files.
Upload – Select Upload, then select the file (or files) you’d like to upload, and select Open.
Any files you upload will be accessible by any member of your team. And just like in SharePoint, you can pin specific files to the top of your list for easy access.
Create a file
To create a Word, PowerPoint, or Excel document in Teams, select New, then select the kind of file you’d like to create. The new file will open in Teams so you can begin editing it, and if you’d prefer to work in the desktop version of the app, select Open in Desktop App at the top of the app, in the middle of the ribbon.
As soon as it’s created, your new file will be available for your team members to edit as well.
Co-edit a file
Files uploaded and shared to a team are accessible to every member of the team. In Office for the web, Office for Android or iOS, or the latest applications from Microsoft 365, you and your team members can co-edit Word, PowerPoint, or Excel documents, or comment on Visio files. No need to check files out or worry about whether one of your colleagues has the document open. Just open the file you need to edit, and if other team members are editing it too, your changes will be seamlessly merged as you work.
Share with people outside your team
If the file is a Word, Excel, PowerPoint, or Visio file, the easiest way to share it is to open the file in its corresponding Office for the web or desktop app. Select Share at the top right corner of the window. From there, enter the aliases or email addresses of the people you want to email the file link to, or select Copy Link to choose where you share the link.
Important: To share with someone outside your team you’ll have to open the file in Office for the web or Office desktop app.
If you want to share with people outside your organization, you may have to select the permissions drop-down (right above where you add the people you want to share with) and choose Specific people.
Once you’ve shared the file, those users can edit the file in real time (unless you unchecked Allow editing in the Link Settings dialog) just like the other members of your team can.
Part 2 in a multi-part series on Azure DevOps pipelines
Welcome to the second post in the series Creating a Multi-Stage Pipeline in Azure DevOps! In the last post we started creating a YAML-based pipeline and set up the build. From the end of the last post there are two paths that can be taken to start deploying code – the Releases UI in Azure DevOps or continuing to add stages to the YAML. Both are viable options, and we have been using Releases since its creation; in this post, however, we are focusing on keeping the pipeline in code.
By the end of this post we will have the packaged code created from the build deployed to two different App Services (we will call them Staging and Production), with appropriate dependencies between stages. Additionally, we will set a pre-deployment approval check before deploying to the Production infrastructure.
Repository – Any Git repository can be used and connected to Azure Pipelines, but this walkthrough will use an Azure Repos Git repository
IDE – This walkthrough was created using Visual Studio Code, which has extensions for pipeline syntax highlighting
We will be continuing with the .NET Core API project and pipeline started in the last post. You can follow along from the first post and then pick up here, or grab the code from the branch ‘post1-build’ as a starting point for this post (https://github.com/cashewshideout/blog-azurepipeline/tree/post1-build). It is not necessary to have previous knowledge of .NET Core for this walkthrough; the concepts of creating the pipeline are universal across all supported languages.
This is the tentative list of planned posts in the series. Links and list will be updated as posts are published.
Preparation – Azure Infrastructure and Azure DevOps Service Connection
In order to deploy the code, we will need a place to host it. For this post we will be using Azure App Services. There is a free tier for App Service Plans, so no cost will be incurred for this walkthrough.
There are multiple ways to get these resources set up, so go ahead and use your preferred method. I’ll outline a few steps to get them set up in Visual Studio Code. The resources we need are an App Service Plan with two App Services (one for staging and one for production).
In the Project Settings select ‘Service connections’
Create a new service connection
Select ‘Azure Resource Manager’
Select ‘Service principal (automatic)’ for the Authentication method
Select appropriate Subscription and fill out details
Make sure ‘Grant access permission to all pipelines’ is selected and Save
Pipeline – First Look at Deployment Stage
Phew, now with that setup out of the way, we can get back to setting up the pipeline! Our first priority is getting the code to the staging instance. Before adding new code, let’s review what the pipeline looks like currently – take a look at the gist on GitHub:
A pipeline is a collection of stages. Stages can run sequentially or in parallel depending on how you set dependencies up (more on dependencies later). Jobs in a stage all run in parallel and tasks within a job run sequentially.
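That hierarchy can be sketched in YAML like this (stage, job, and step names here are illustrative placeholders, not part of the walkthrough’s pipeline):

```yaml
# Illustrative skeleton only - names are placeholders
stages:
  - stage: Build_Stage
    displayName: 'Build Stage'
    jobs:
      - job: Build_Job          # jobs in a stage run in parallel
        steps:                  # steps within a job run sequentially
          - script: echo "restore and build"
          - script: echo "publish artifact"
  - stage: Deploy_Stage
    dependsOn: ['Build_Stage']  # run only after Build_Stage completes
    jobs:
      - job: Deploy_Job
        steps:
          - script: echo "deploy"
```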
Running jobs in parallel
The applications we work on at MercuryWorks all have functional tests and infrastructure as code, each of which needs its own package of files sent to the release. In the build stage we end up having three different jobs – one to build and create the application artifact, one to build and create the functional test artifact, and one to create the infrastructure artifact. They all run in parallel, which reduces the overall time to complete the stage.
Right now we only have one stage for the build, with the last step creating an artifact of the built code. The tasks to deploy this code to the staging infrastructure will be in a separate stage (technically everything could be in one stage, but that would be pretty overwhelming to try to understand and debug).
This stage will have a few new concepts compared to the build. Let’s take a look at what the stage looks like – don’t panic – we will walk through all of the new settings. Here’s the next gist:
deployment (line 8) – The first major difference from the build stage is that the job listed under jobs is named deployment instead of job. This is a specially named job type that allows additional options beyond a standard job, including deployment history and deployment strategies.
environment (line 12) – A bit further down there is a property named environment. This is set to ‘Staging’ because that is what we are naming this environment; in the deployment stage to the production instance it will be named ‘Production’. These environments can be named according to your own environment naming strategy. We will go over Environments, and what setting this property allows us to do, later in this post.
strategy (line 13) – The strategy section has a variety of life cycle hooks (special named jobs) that can be used in different deployment strategies. You can find a description of all of the available options here – https://docs.microsoft.com/en-us/azure/devops/pipelines/process/deployment-jobs?view=azure-devops#deployment-strategies. For this walkthrough we are using the simplest strategy, runOnce. In runOnce, each life cycle hook is executed once, and then, depending on the result, an on: success or on: failure hook is run. Our application is very simple, so we only use the deploy hook.
steps (line 16) – Each life cycle hook has its own set of steps to execute. At this point things should look familiar, outside of the specific tasks being used. First we extract the files from the zip that was created in the build, then the files are deployed to an Azure App Service. We are deploying a .NET Core application here, but the deploy task can also be used for applications built in PHP, Node.js, and a few other languages.
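Putting those properties together, a staging deployment stage might look roughly like the following sketch (the stage, app, and service connection names here are assumptions, and the line numbers referenced above correspond to the original gist rather than this sketch):

```yaml
# Sketch of a staging deployment stage - names are placeholders
- stage: Staging_Stage
  displayName: 'Deploy to Staging'
  jobs:
    - deployment: Staging_Deployment   # a 'deployment' job, not a plain 'job'
      displayName: 'Deploy to Staging App Service'
      pool:
        vmImage: 'ubuntu-latest'
      environment: 'Staging'           # ties this job to the Staging environment
      strategy:
        runOnce:                       # simplest strategy: each hook runs once
          deploy:
            steps:
              - task: ExtractFiles@1   # unzip the build artifact
                inputs:
                  archiveFilePatterns: ''   # not filled in yet
                  destinationFolder: ''     # not filled in yet
              - task: AzureWebApp@1    # deploy the files to the App Service
                inputs:
                  azureSubscription: 'MyServiceConnection'  # service connection name (assumed)
                  appName: 'my-staging-app'                 # App Service name (assumed)
                  package: ''                               # not filled in yet
```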
Reviewing the tasks, you should notice that the archive location in the extract files task and the package in the deploy step are not filled in yet. In the last post we set up the build, which created an artifact that needs to be referenced here. Let’s add three more lines and fill in the package location details.
dependsOn (line 7) – This is an array of stages that this stage should verify have successfully completed before running. Using this array on each stage lets you arrange the pipeline to run exactly in the order you need. The deployment stage we just added should not run before, or in parallel with, the Build stage, because it needs the artifact created there. Note that this needs to match the name set in the stage: property, not the display name.
download (lines 18-19) – This is a specially named task that downloads artifacts created in previous stages. Specifying current indicates that we want artifacts from the current context – the run that is currently happening, not a previous run. The artifact specified to download is the one created in the Build stage (it was named ‘app’).
archiveFilePatterns/destinationFolder (lines 27-28) – Now we can tell this task where to find the zip file. The location artifacts are downloaded to is contained in the variable $(Pipeline.Workspace). The folder structure was defined in the build, and we can refresh our memory of it by reviewing the artifacts created from the last build. I generally like to extract files to a new directory, so we specify a files folder.
package (line 32) – The .NET Core publish task put all of the files inside a folder named after the project, which is why there is an extra folder inside the files folder here. To check the exact file structure of the zip file that was created, the artifact can be downloaded from the view above.
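With those details filled in, the relevant pieces of the stage might look like this sketch (artifact, project, app, and service connection names are assumptions based on the walkthrough; the line numbers above refer to the original gist):

```yaml
# Staging stage sketch with dependsOn, download, and package paths filled in
- stage: Staging_Stage
  displayName: 'Deploy to Staging'
  dependsOn: ['Build_Stage']       # must match the Build stage's name, not its display name
  jobs:
    - deployment: Staging_Deployment
      environment: 'Staging'
      strategy:
        runOnce:
          deploy:
            steps:
              - download: current  # artifacts from the current run, not a previous one
                artifact: 'app'    # the artifact created in the Build stage
              - task: ExtractFiles@1
                inputs:
                  archiveFilePatterns: '$(Pipeline.Workspace)/app/*.zip'
                  destinationFolder: '$(Pipeline.Workspace)/files'
              - task: AzureWebApp@1
                inputs:
                  azureSubscription: 'MyServiceConnection'          # assumed
                  appName: 'my-staging-app'                         # assumed
                  package: '$(Pipeline.Workspace)/files/MyProject'  # project folder name assumed
```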
Deploy to Staging
There are still a couple of things to walk through, but the pipeline is at a point where we can test it out. Here is what the full pipeline should look like now. Let’s commit the updates and watch it run. Here’s a gist:
Checking on the build, there are some UI changes now that the second stage has been added.
Clicking into the pipeline, it now shows both stages. Notice the ‘Build’ stage indicates that it has 1 job (0/1 completed while it is running). Within the stage is the Application Build job. If there were more jobs within the stage, they would be listed here.
If you do not see the job list, hover over the stage and click on the up/down arrow symbol that will show up in the top right corner of the box. Clicking into a job will give a further break down of each task and logs.
Once the pipeline has completed, head on over to your site! The endpoint for this will be https://<your-app-name>.azurewebsites.net/weatherforecast. This sample application has no endpoint at the root level.
Production Environment Deployment
The final stage needed in the pipeline is one to deploy to the production App Service that was created. It will be pretty similar to the previous stage, with a couple of exceptions:
Make sure that the stage and job names are all updated to indicate they are for Production, as is the name of the web app being deployed to.
One place I want to point out is the dependsOn section. In this stage it has been updated to indicate a dependency on the Build stage – because it needs the artifacts – as well as on the Staging stage. We don’t want production being released before (or at the same time as) staging.
For a quick demonstration, this is what the pipeline would look like in Azure DevOps if the Production stage only had a dependency on the Build stage (dependsOn: ['Build_Stage']).
Notice that the dependency lines show both Staging and Production running at the same time after the Build stage has completed. Instead, let’s make sure that the Production stage has all of the proper dependencies and commit the code. Here’s a gist:
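With both dependencies in place, the Production stage might look like this sketch (app, project, and service connection names are again assumptions):

```yaml
# Production stage sketch - depends on both Build and Staging
- stage: Production_Stage
  displayName: 'Deploy to Production'
  dependsOn: ['Build_Stage', 'Staging_Stage']  # needs the artifact AND a successful Staging deploy
  jobs:
    - deployment: Production_Deployment
      environment: 'Production'
      strategy:
        runOnce:
          deploy:
            steps:
              - download: current
                artifact: 'app'
              - task: ExtractFiles@1
                inputs:
                  archiveFilePatterns: '$(Pipeline.Workspace)/app/*.zip'
                  destinationFolder: '$(Pipeline.Workspace)/files'
              - task: AzureWebApp@1
                inputs:
                  azureSubscription: 'MyServiceConnection'          # assumed
                  appName: 'my-production-app'                      # assumed
                  package: '$(Pipeline.Workspace)/files/MyProject'  # assumed
```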
Congratulations! Your application has been deployed to all environments.
Before we celebrate too much, there is one last thing we need to do. If you watched the pipeline run, you would have noticed that the Production stage ran immediately after the Staging stage. While some projects may be able to do that with an appropriate number of tests, most of the time we prefer to have an approval step between stages.
We use the Staging environment as a way to demo new functionality to clients and like to have a bit more planning around when new code is deployed.
This is where Environments come in – we touched on them briefly when looking at the deployment stage. An environment is more than just a nice way to indicate in the pipeline code which environment a stage is for.
In Azure DevOps under the Pipelines menu item in the navigation there is a section named Environments. After clicking on this, you will see that there are already some environments listed. These were automatically created when the environment property was added to the pipeline script.
This is a nice, quick way to determine what version of the application is deployed to each environment and what pipeline run it is related to.
Another benefit of defining environments is the ability to set approval gates. When in a specific environment, click on the three-dot menu in the top right and select ‘Approvals and checks’.
There are multiple types of checks that can be set before an environment (some will look familiar to those of you who have used approval gates in the classic Releases UI). We are only adding an approval for this pipeline, so go ahead and select ‘Approvals’. On this form you can add specific users and/or groups to the list of approvers. Fill out the approvers and click ‘Create’.
Head back to the pipeline and select ‘Run pipeline’ in the top right. Leave the default options, select ‘Run’, and let the pipeline run. Once Staging completes, you should see Production marked as ‘Waiting’, and the person you set as an approver should have received an email. Logging in as the approver, there will be a Review button above the pipeline flow.
Clicking into Review, the Approver can ‘Approve’ or ‘Reject’ the deployment and add an optional comment.
Once approved, the Production stage will run as normal. Final congratulations! You now have a full pipeline in YAML with multiple environments and approvers.
This should get you started creating YAML pipelines in Azure DevOps. The next post will cover some additional tips and tricks to help streamline creating your pipeline. There are many ways to customize these pipelines, so don’t be surprised to see future posts that extend what was started here.
If you would like your application started or switched to using Azure DevOps Pipelines, contact us and let’s see how we can help!
In many organizations, there’s a constant stream of information moving through Teams at any time. It can be difficult to track down an important message you want to reference later on. The solution – bookmarked content.
Microsoft Teams makes it easy to bookmark specific pieces of content, whether it’s a message, a mention, or an attachment. Bookmarking your content helps you cut through the noise and find the information you need quickly.
Learn how to bookmark your content with the steps below:
Click the ellipses next to the content you want to bookmark
Click ‘Save this message’
To access your saved messages, click your profile picture in the upper right-hand corner of Teams
Then click ‘Saved’
This will open all of the bookmarked content you have saved to your account.
You can also access your saved messages by using commands in the command box located at the top of Teams. Simply type /saved and hit Enter to open your saved content.