Home
Did you already read the README for the repository? Perhaps you want to go for the quick start instead of this overview.
Beaver is a flexible build/configure/deploy system for ASP.NET projects based in PowerShell. It is designed to simplify the task of maintaining builds that may span many target environments (such as dev, QA, production) where each has its own slightly different configuration requirements.
Everything in Beaver is based on one consistent, simple idea: the "pipeline." A pipeline is simply a folder containing "buildlet" files that perform tasks in alphabetical order. The structure is quite similar to POSIX System V-style init scripts, except that instead of a pipeline per runlevel there is a pipeline per build stage. Buildlets can be written in several domain-specific languages (PowerShell, XDT Transform, MSBuild, or even sub-pipelines), and you can implement your own buildlet provider if you wish. This architecture lets you implement small, easily digestible, single-responsibility units of action - and gain an immediate understanding of the build process simply by looking in a few folders and seeing the order things run in and their descriptions.
A pipeline is a folder containing a set of buildlet files. Each buildlet is evaluated in alphabetical order when the pipeline is executed. Buildlets can be in any format you wish; each file extension just needs a Pipeline Provider implemented to support it.
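For illustration, a build-stage pipeline might look like this on disk (the names and extensions here are hypothetical; only the alphabetical ordering is significant):

```
Build/
  01-Compile.build         MSBuild buildlet
  02-Web.config.xdt        XDT transform buildlet
  03-Copy-Assets.ps1       PowerShell buildlet
```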
The pipeline pattern is a familiar one. Linux, for example, uses it in its init scripts (e.g. /etc/rcX.d). Sitecore uses .NET-based pipelines for all sorts of procedural processes. Utilizing it for a scripting process allows you to focus on writing small, single-purpose pipeline step scripts that are easy to maintain. For new developers, being able to review the pipelines quickly at a filesystem level gives them quick insight into how the build process works.
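The core of the pattern is small enough to sketch. The following is not Beaver's code - just a minimal Python illustration of running a folder of buildlets in alphabetical order, dispatching each file to a provider based on its extension:

```python
import os

def run_pipeline(folder, providers):
    """Execute every buildlet in the folder, in alphabetical order.

    `providers` maps a file extension to a callable that knows how to
    run that kind of buildlet. Files with no registered provider are
    skipped.
    """
    results = []
    for name in sorted(os.listdir(folder)):  # alphabetical order drives execution order
        path = os.path.join(folder, name)
        if not os.path.isfile(path):
            continue
        provider = providers.get(os.path.splitext(name)[1])
        if provider:
            results.append(provider(path))
    return results

# Toy "provider": in a real system this would invoke PowerShell, MSBuild, XDT, etc.
providers = {".ps1": lambda p: "ran " + os.path.basename(p)}
```

The important property is that adding, removing, or reordering a step is just a file rename - no central script needs editing.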
Pipelines are extremely flexible; you don't just have to use them for deploying .NET projects.
Pipeline items can be interchangeably written in PowerShell, MSBuild, XDT (Microsoft Config Transforms), or Overlay (replaces a file). Or, you can implement your own Pipeline Provider to use any scripting language you can imagine. PowerShell variable scopes are transferred where possible, so you can use your variables as properties in MSBuild or XDT files.
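As a concrete example of one buildlet format: an XDT buildlet is just a standard Microsoft config transform. A hypothetical transform that switches off debug compilation might look like this:

```xml
<configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
  <system.web>
    <!-- Match the existing <compilation> element and set debug="false" -->
    <compilation xdt:Transform="SetAttributes(debug)" debug="false" />
  </system.web>
</configuration>
```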
Almost all projects are deployed to more than one environment, such as Dev, QA, and Live. Beyond that, a project's configuration may sometimes fork for different server roles, such as Content Management, Content Delivery, or Indexing Server. These scenarios are all supported with a minimum of repeated deployment code.
Use archetypes to define common deployment features, such as "production," "content delivery," or "disable custom errors" that cross-cut environments.
Then, use environments to combine the archetypes - and environment-specific extensions - that a given deployment target needs. For example, a live server might use the "production" and "content delivery" archetypes, and define a specific set of servers that environment should deploy to after it completes successfully.
Archetypes and environments are both defined using the same format; an environment simply "inherits" the archetypes' pipelines just as any build inherits the global pipelines.
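On disk, this might be organized along these lines (the layout and names are illustrative only, not a required structure):

```
Archetypes/
  Production/            cross-cutting steps, e.g. disable debug and custom errors
  Content-Delivery/
Environments/
  Live/                  composes Production + Content-Delivery, plus Live-only steps
  QA/
```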
Pipelines may contain other pipelines, down to any practical depth. Have a complex sub-process that involves several file copies and a config transform? Group it in a sub-pipeline.
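For example, a hypothetical deploy pipeline that groups its configuration steps into a sub-pipeline:

```
Deploy/
  10-Copy-Binaries.ps1
  20-Configure/              a sub-pipeline; its buildlets run as one step
    10-Web.config.xdt
    20-Set-Permissions.ps1
  30-Restart-AppPool.ps1
```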