 <?xml-stylesheet type="text/css" href="http://mspbuilder.com/Data/style/rss1.css" ?> <?xml-stylesheet type="text/xsl" href="http://mspbuilder.com/Data/style/rss1.xsl" ?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:itunes="http://www.itunes.com/dtds/podcast-1.0.dtd">
  <channel>
    <title>MSP Builder Blog</title>
    <link>http://mspbuilder.com/blog</link>
    <description />
    <docs>http://www.rssboard.org/rss-specification</docs>
    <generator>mojoPortal Blog Module</generator>
    <language>en-US</language>
    <ttl>120</ttl>
    <atom:link href="http://mspbuilder.com/Blog/RSS.aspx?p=3~3~4" rel="self" type="application/rss+xml" />
    <itunes:owner />
    <itunes:explicit>no</itunes:explicit>
    <item>
      <title>Endpoint Management - Service Level Automation</title>
<description><![CDATA[<p>Many MSPs can benefit from offering different levels of service to their customers - it allows them to tailor their product&nbsp;to the size and budget of the organization they serve. The challenge is finding ways to automate this to deliver consistency without significant effort. Common approaches we've seen range from defining automation policies that run software-deployment scripts and linking those policies to each customer, all the way down to manually running the scripts needed to deploy the applications. The challenge with manual steps like these - as with all manual actions - is consistency. The RMM Suite solves this through Service Class Automation.</p>

<h5>Service Class Automation</h5>

<p>Just to clarify the term, "Service Class" (or Class of Service) usually assigns a name to the delivery of a specific set of services. A good example is the classic Bronze / Silver / Gold scheme, where Bronze might provide basic monitoring and AV while Gold provides advanced monitoring, proactive maintenance, and comprehensive endpoint security services. This can relate to several services within an MSP practice, including monitoring, software,&nbsp;maintenance, patching, and security.</p>

<p>The RMM Suite employs a basic Service Class of Unmanaged and Managed, which is used broadly to apply or block automation.&nbsp;</p>

<p><strong>Unmanaged</strong>&nbsp;- This can be a "break/fix" or "time and materials" customer with no automation. RMM Suite customers also use this mode to onboard new clients. Since an unmanaged customer receives no automation, it allows a period of time after deploying agents to perform discovery actions. That time can be used to prepare custom configurations, set up software licenses for automated deployments, and identify any special monitoring requirements. Once all customer preparation is completed, a client can be switched to Managed. This is defined using either a Customer Custom Field or - in VSA - a Machine Group root name.</p>

<p><strong>Managed -&nbsp;</strong>This represents a generic state where ALL automated services can be applied. The automation policies specifically look for the "unmanaged" status, treating all other status types as "managed". This allows a generic classification of "managed" as well as specific sub-classifications&nbsp;or Service Classes.&nbsp;The service classes can also be used to drive client billing.</p>

<p><strong>Service Classes</strong>&nbsp;- These are codes - whether colors, metals, animals, or simply an alpha-numeric ID - that define a specific set of services. These codes can be distinct or cumulative - that's completely up to the MSP. Cumulative codes take a bit more planning and configuration effort, but can simplify certain aspects of the automation.</p>

<h5>Distinct Code Mapping</h5>

<p>Distinct codes will map a set of specific components and services to a single code. A system filter identifies the code and applies the appropriate services. Note that the same services can be associated with multiple Service Class codes.</p>

<p class="text-indent-1"><strong>Iron</strong>&nbsp;- Basic AV, Patching</p>

<p class="text-indent-1"><strong>Steel</strong>&nbsp;- Basic AV, Antimalware, Patching, Application Updating, Basic Monitoring</p>

<p class="text-indent-1"><strong>Titanium</strong> - Advanced AV, Endpoint Security, Antimalware, Patching, Application Updating, Basic Monitoring, Advanced Monitoring</p>

<p>There are three automation policies and three filters. Each filter checks for the Service Class code and applies the appropriate automation policy. The policy applies the products and services that are part of the Service Class. You will see that two policies have Basic AV, Antimalware, and Application Updating, three have Patching, and two have products unique to that class. This is a simple mapping of code to services and works well when there is a small set of&nbsp;classes and products.</p>
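<p>Purely as an illustration - filter syntax differs by RMM platform, and the names below are hypothetical - the distinct mapping described above amounts to three filter/policy pairs:</p>

<pre>
<code>; Sketch only - one filter and one automation policy per Service Class
Filter "SC-Iron"     : CCOS = "Iron"     --&gt; Policy "Iron Services"     (Basic AV, Patching)
Filter "SC-Steel"    : CCOS = "Steel"    --&gt; Policy "Steel Services"    (Basic AV, Antimalware, Patching, App Updating, Basic Monitoring)
Filter "SC-Titanium" : CCOS = "Titanium" --&gt; Policy "Titanium Services" (Advanced AV, Endpoint Security, Antimalware, Patching, App Updating, Basic + Advanced Monitoring)</code></pre>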

<h5>Cumulative Code Mapping</h5>

<p>This method creates a filter and automation policy for&nbsp;<em>each distinct product or service</em>&nbsp;instead of the service class. The filter applies a specific product or service when it matches one or more Service Class codes. This is how it works:</p>

<p class="text-indent-1"><strong>Basic AV</strong>&nbsp;- Filter triggers at Iron OR Steel levels</p>

<p class="text-indent-1"><strong>Advanced AV</strong>&nbsp;- Filter triggers at Titanium level</p>

<p class="text-indent-1"><strong>Antimalware </strong>- Filter triggers at Steel OR Titanium levels</p>

<p class="text-indent-1"><strong>Patching </strong>- Filter triggers at Iron OR Steel OR Titanium levels</p>

<p class="text-indent-1"><strong>Basic Monitoring</strong> - Filter triggers&nbsp;at Steel OR Titanium levels</p>

<p class="text-indent-1"><strong>Advanced&nbsp;Monitoring</strong> - Filter triggers&nbsp;at&nbsp;Titanium level</p>

<p>While this is certainly more complex and requires distinct filters and automation policies for each service, it provides greater flexibility when there are additional Service Classes. Consider adding a new "Tin" service class that only provides patching, and an "Aluminum" level with Patching and Application Updating. By simply updating the filter associated with the products to trigger on these new service classes, the automation applies without the need to create both new filters AND automation policies.&nbsp;</p>
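<p>Sketching this (the names and syntax are again illustrative, not actual RMM Suite configuration), adding the "Tin" and "Aluminum" classes touches only the filters for the affected services:</p>

<pre>
<code>Before:
  Filter "Patching"     : CCOS in (Iron, Steel, Titanium)
  Filter "App Updating" : CCOS in (Steel, Titanium)

After - two filter edits, no new filters or automation policies:
  Filter "Patching"     : CCOS in (Tin, Aluminum, Iron, Steel, Titanium)
  Filter "App Updating" : CCOS in (Aluminum, Steel, Titanium)</code></pre>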

<h5>How the RMM Suite uses Service Class Mapping</h5>

<p>Each day, when the Daily Audit application runs, it determines the Service Class code assigned to the customer. This starts by checking for a Customer Custom Field called CCOS. The value - if defined - is mapped to the "SC:<em>id</em>"&nbsp;tag and written to the System Roles Agent Custom Field, along with any other TAGs based on the applications and services found. The TAGs can be used to drive views to apply policies, which is useful for applying the monitors associated with these Service Classes. The TAG can also be used directly by the Daily Maintenance tool to install application components, either by local script or RMM script.</p>
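<p>Conceptually, the daily flow looks like this (a simplified sketch using the field names from the text):</p>

<pre>
<code>Customer Custom Field "CCOS" = "Titanium"
   |  Daily Audit maps the value to a TAG...
   v
Agent Custom Field "System Roles" = "SC:Titanium" (plus any application/service TAGs found)
   |
   v
Views/policies and Daily Maintenance act on the "SC:Titanium" TAG</code></pre>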

<p>A second advantage of this method is that the Service Class identity is added to a machine-specific field. Some RMMs do not expose the Customer Custom Fields to agent scripting, and this works around that limitation.</p>
<br /><a href='http://mspbuilder.com/blog-endpoint-management-service-level-automation'>gbarnas</a>&nbsp;&nbsp;<a href='http://mspbuilder.com/blog-endpoint-management-service-level-automation'>...</a>]]></description>
      <link>http://mspbuilder.com/blog-endpoint-management-service-level-automation</link>
      <author>gbarnas@mspbuilder.com (gbarnas)</author>
      <comments>http://mspbuilder.com/blog-endpoint-management-service-level-automation</comments>
      <guid isPermaLink="true">http://mspbuilder.com/blog-endpoint-management-service-level-automation</guid>
      <pubDate>Tue, 15 Nov 2022 15:00:00 GMT</pubDate>
    </item>
    <item>
      <title>Endpoint Management - Maintaining Components</title>
      <description><![CDATA[<p>Maintaining custom applications can be challenging - you need to detect where the software is installed, which version is present, and then run the appropriate scripts to update the applications. While most RMM platforms can accomplish this, many can't easily identify these custom applications, and tying the detection, version identification, and update automation together isn't a trivial task. That's where the RMM Suite can help.</p>

<h5>Detecting the Application</h5>

<p>The RMM Suite provides a highly customizable Daily Audit feature that can identify applications either by registration with Add/Remove programs or as an installed service. The first step is to determine the best detection method.&nbsp;</p>

<h6>Application Name/Version Detection</h6>

<p>Start by examining the SysInfo.INI file from&nbsp;the agent where the software is installed. This is the Daily Audit cache file and is located in the PROGRAMDATA\MSPB folder.&nbsp;Many RMM platforms will collect this file and store it in the agent's data folder automatically when the Daily Audit completes. Open the file in a text editor such as Notepad and locate the SWINFO section. This section lists all of the application / version / vendor data reported by Windows. If the application is listed, copy the application ID into the APP VERSION ROLES section of the AUDIT.INI configuration and define a unique TAG value. Tags should be 3-4 alpha-numeric characters that identify an application and version. Note that multiple detections are possible - a check for "Accounting - 12.6" can map to "AA126" while a generic "Accounting - " can map to "ACCT". The generic tag can detect ANY version while the specific tag identifies a particular version. Once this entry is added to the Audit config data, detections will begin during the next daily operational cycle.</p>
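<p>Using the accounting example above, the entries might look like this - the section name comes from the text, but the exact key/value layout is illustrative, so check the AUDIT.INI documentation for the precise format:</p>

<pre>
<code>[APP VERSION ROLES]
; Application ID (as listed in SWINFO)  = TAG
Accounting - 12.6 = AA126  ; matches this specific version
Accounting -      = ACCT   ; matches any version</code></pre>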

<h6>Service Detection</h6>

<p>A Windows service provides a simple and direct detection method. Start by identifying the name of the service by running "Net Start" in a command prompt on the endpoint where the application is running. Identify the correct service name and add it to the SERVICE ROLES section in the AUDIT.INI configuration data, assigning an appropriate TAG value. This detection will begin during the next daily operational cycle.&nbsp;</p>
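<p>For example (an illustrative layout - the service name and TAG here are hypothetical):</p>

<pre>
<code>C:\&gt; net start
  ...
  Acme Accounting Data Service
  ...

[SERVICE ROLES]
Acme Accounting Data Service = ACCD</code></pre>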

<h5>Leveraging TAGs in Daily Maintenance</h5>

<p>Each task in Daily Maintenance can be tied to one or more System Role Tags. Actions can be taken when a tag is present or missing, and can be combined to require multiple related tags. Tasks can be used to directly upgrade software by looking for an outdated version TAG or can run multiple tasks to uninstall the existing version and then install the new version. The tasks can be local scripts or RMM Scripts run by API, or any combination. Daily Maintenance can directly unzip and execute packages deployed from an RMM script, or download a package hosted by MSP Builder. (MSP Builder packages utilize a security token to assure authenticity. We host your custom packages at no additional cost and add the security token when created or updated.)&nbsp;</p>

<p>Note that Daily Maintenance tasks are executed in the sequence in which they are defined, so be sure to order them so that an uninstall runs before the update process.&nbsp;</p>
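<p>A sketch of that ordering requirement - the task names and condition syntax are invented for illustration:</p>

<pre>
<code>; Tasks run top to bottom - the uninstall must be defined first
Task 1: Remove Accounting 12.6   ; runs when TAG "AA126" is present
Task 2: Install Accounting 13.0  ; runs when TAG "ACCT" is present and TAG "AA130" is missing</code></pre>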

<h5>Summary</h5>

<p>The RMM Suite automation provides a rapid and low-impact mechanism for application and endpoint maintenance with several significant advantages:</p>

<ul>
	<li>Tasks run every day on every endpoint, providing maximum delivery exposure.</li>
	<li>Daily Audit runs immediately prior to Daily Maintenance, identifying software and services.</li>
	<li>Daily Maintenance can leverage the TAGs set by the audit to update components&nbsp;that are vulnerable or outdated.</li>
	<li>Daily Audit runs again, after Daily Maintenance completes, reporting on the now-current application and component status. This also updates the tags, disabling further update operations.</li>
	<li>No additional RMM automation is needed - no Filters/Views, automation policies, or scheduling, significantly reducing the load and complexity on the RMM platform.</li>
</ul>

<p>&nbsp;</p>
<br /><a href='http://mspbuilder.com/endpoint-management-maintaining-components'>gbarnas</a>&nbsp;&nbsp;<a href='http://mspbuilder.com/endpoint-management-maintaining-components'>...</a>]]></description>
      <link>http://mspbuilder.com/endpoint-management-maintaining-components</link>
      <author>gbarnas@mspbuilder.com (gbarnas)</author>
      <comments>http://mspbuilder.com/endpoint-management-maintaining-components</comments>
      <guid isPermaLink="true">http://mspbuilder.com/endpoint-management-maintaining-components</guid>
      <pubDate>Thu, 20 Oct 2022 14:00:00 GMT</pubDate>
    </item>
    <item>
      <title>Onboarding Automation - Deploying New Components</title>
      <description><![CDATA[<p>Do you need to add new components to a standard configuration for one, several, or all customers?<br />
The RMM Suite Onboard Automation (OBA)&nbsp;tool will help you get this done through a single config file update.</p>

<h5>Using a Standard Configuration</h5>

<p>This concept starts by defining a set of configuration settings and software that needs to be deployed within your support stack. This should include settings and software that you deploy to most or all customers, then settings and software deployed to specific customers. The latter is usually LOB applications, as the RMM Suite install scripts can leverage Cloud Script Variables (CSVs) to both control the deployment globally while delivering customer-specific content.</p>

<p>As your product stack changes - either by adding or replacing products - your Standard Configuration changes. This has been a difficult process for many as configurations may need to change and software uninstalled before installing a new set of products. This is where the RMM Suite OBA tool can help.</p>

<h5>Deploying the Standard Configuration</h5>

<p>The OBA tool runs when an agent first checks in to deploy software and configure the endpoint to meet the Standard Configuration requirements. See this <a href="https://www.mspbuilder.com/blog-onboarding-automation-hands-off-workstation-build-1" target="_blank">blog post</a> for&nbsp;full&nbsp;information on using the OBA Tool and the Standard Configuration.&nbsp;</p>

<h5>Dealing with Change</h5>

<p>Change is inevitable, but it should not be difficult! A typical change to a Standard Configuration is switching to a different product, such as Antivirus software. Let's assume your Standard Configuration utilized Iron-Man AV, but now you are switching to the more powerful Titanium-Man AV product. You need to remove the old product and then install the new one. This requires just two scripts and three changes to your OBA config file:</p>

<p>Script: <strong>Uninstall Iron-Man AV</strong> - Create an RMM script to uninstall the Iron-Man AV product, suppressing any reboots.</p>

<p>Script:&nbsp;<strong>Install Titanium-Man AV</strong>&nbsp;- Create an RMM script to install the new Titanium-Man AV product, suppressing any reboots.</p>

<p>Change the OBA configuration file:</p>

<ul>
	<li>Disable or remove the definition that installed the Iron-Man AV product</li>
	<li>Add a definition to run the <strong>Uninstall Iron-Man AV</strong> script</li>
	<li>Add a definition to run the <strong>Install Titanium-Man AV</strong> script</li>
</ul>
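<p>In a hypothetical OBA config file (the layout below is illustrative, not the actual file format), the three changes might look like this:</p>

<pre>
<code>[All Workstations]
Install Iron-Man AV     = No   ; was Yes - now disabled
Uninstall Iron-Man AV   = Yes  ; new - runs the removal script
Install Titanium-Man AV = Yes  ; new - runs the install script</code></pre>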

<p>Once these changes are in your OBA configuration file, the next daily cycle will discover that these two tasks have never been run and will run them on all endpoints (based on the Task Category where these are defined, of course). The next time the endpoint is online and runs the Daily Tasks, these changes to your Standard Configuration will be processed and the endpoint will be compliant with your new standards.</p>

<h5>Summary</h5>

<p>Despite the "Onboarding Automation" name, the capabilities of the OBA tool extend to helping you maintain a "Standard Configuration" without complex scripting or other RMM automation tools. The OBA tool also works hand in hand with the Daily Maintenance tool, which can be used to deploy and update components on the endpoint, especially when using Customer Class of Service (CCOS) tags. Daily Maintenance&nbsp;could be used to deploy and maintain either Iron-Man AV or Titanium-Man AV based on the CCOS tag being "Iron" or "Titanium" (or any other level-identification term).&nbsp;</p>
<br /><a href='http://mspbuilder.com/blog-onboarding-automation-deploying-new-components-1'>gbarnas</a>&nbsp;&nbsp;<a href='http://mspbuilder.com/blog-onboarding-automation-deploying-new-components-1'>...</a>]]></description>
      <link>http://mspbuilder.com/blog-onboarding-automation-deploying-new-components-1</link>
      <author>gbarnas@mspbuilder.com (gbarnas)</author>
      <comments>http://mspbuilder.com/blog-onboarding-automation-deploying-new-components-1</comments>
      <guid isPermaLink="true">http://mspbuilder.com/blog-onboarding-automation-deploying-new-components-1</guid>
      <pubDate>Tue, 30 Aug 2022 14:00:00 GMT</pubDate>
    </item>
    <item>
      <title>Onboarding Automation - Hands-Off Workstation Build</title>
<description><![CDATA[<p><em>Can you imagine deploying 60-70 new computers per day for multiple customers with just one or two techs?</em><br />
The RMM Suite and your RMM platform make this "child's play"!</p>

<p>The RMM Suite provides an Onboard Automation tool that can fully automate the ongoing deployment of endpoint software and configuration tasks. It's a powerful tool that takes just a few minutes of planning and setup to effectively leverage. Follow along as we walk through a typical setup scenario.</p>

<h5>General Concepts</h5>

<p>The&nbsp;Onboard Automation (OBA) tool runs when an agent first checks into the RMM platform. If an alarm is configured (recommended), this happens within 2-3 minutes of the agent first being installed and checking in. Without the alarm, it can take up to 24 hours for the RMM to schedule the daily automation tasks, so using the alarm is essential if you are going to deploy many new systems from your tech bench.&nbsp;</p>

<p>The OBA tool consults a configuration file for a list of RMM scripts to run. It uses the APIs to run these scripts based on an agent's classification within several task categories. This initial run usually executes many scripts to deploy software and configure the endpoint, first for MSP tasks and then for customer-specific tasks. This provides a high degree of flexibility in a highly automated process.</p>

<p>Once the initial execution runs, the OBA tool continues to run each day, comparing the current task list with what has already been completed. If a new task is found, it is run and the process logged. This allows a "Standard Configuration" to be defined by the MSP for internal and client-specific settings that is maintained automatically.&nbsp;</p>

<h5>Task Categories</h5>

<p>There are several categories of tasks that the&nbsp;OBA&nbsp;tool uses to decide what it should do.&nbsp;There are three "general" categories that allow the MSP to perform their own tasks, called "All Agents", "All Servers", and "All Workstations". Scripts defined in these categories run on all endpoints unless specifically excluded for a particular customer. This allows you, for example, to deploy a configuration script to every workstation but exclude a specific customer that needs an alternate setting. The concept allows tailoring an otherwise broadly deployed process.</p>

<p>Additional categories are tied to specific customers, allowing execution of scripts on all agents, just servers, just workstations, or even workstations in a specific site location.&nbsp;</p>

<h5>Automation Controls</h5>

<p>The OBA configuration file simply defines each of the task categories, then lists the names of the scripts that should be executed. Each script has a control parameter associated with it - Yes | No | All - that controls how it will be run. "No" allows the script to remain in the config file but it is disabled and will be ignored. "Yes" will cause the script to be executed if the agent belongs to a "managed" group. This will skip any agent that is considered "unmanaged" (break/fix or not yet managed/onboarding stage). The "All" option allows the script to execute on all endpoints regardless of the managed/unmanaged status. This is especially useful for scripts that configure the endpoint or the RMM agent itself.</p>
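<p>As a sketch (the layout and script names are illustrative), a category in the config file might read:</p>

<pre>
<code>[All Agents]
Configure RMM Agent Settings = All  ; runs on every endpoint, managed or unmanaged
Deploy EDR Product           = Yes  ; managed agents only
Deploy Legacy AV             = No   ; kept for reference, never runs</code></pre>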

<h5>Preparation And Planning</h5>

<p>Preparation mainly consists of creating the RMM scripts needed to perform the application installation and endpoint configuration processes. These should be created and tested before defining them in the OBA config file. Testing can be completed simply by manually executing the script from the RMM platform.</p>

<p>Planning requires an understanding of what processes should be performed globally, which customers should be excluded from global tasks, and then selecting the customer-specific tasks. Something to consider here - the RMM Suite app installers are often generic and employ a Cloud Script Variable to assign a license key or similar configuration setting. Customers that don't have a key will abort that script without a "failure", allowing this script to potentially be applied globally, yet executed only where a CSV value has been defined.&nbsp;</p>
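<p>In agent-procedure terms, the pattern looks roughly like this - the variable and script names are hypothetical and the step syntax is simplified for illustration:</p>

<pre>
<code>getVariable("Cloud Script Variable", "AppLicKey", "global:LicKey", "All Operating Systems", "Continue on Fail")
If checkVariable("#global:LicKey#") Equals ""
  comment("No license key defined for this customer - exit without reporting a failure")
else
  executeShellCommand("InstallApp.cmd #global:LicKey#", "Execute as System", "All Windows Operating Systems", "Continue on Fail")</code></pre>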

<h5>Operation</h5>

<p>This is the easy part! Depending on your RMM, you may need to enable the "New Agent" alarm to allow the onboarding process to run immediately after the first check in. Everything from that point onward is automated. Simply maintain the OBA config file to add scripts as the standard configuration changes, knowing that these will run automatically the next time the Daily Tasks are run.</p>

<p>Note that you can define scripts that install, remove, or update endpoint components, but once a specific script has run successfully, it will NOT be run again. Updates can be deployed by including some specific text in the name such as "Update XXX to V3.45" or "uninstall XXX V3.0". The RMM Suite maintains this tracking in the "init" sub-key of our registry path, so an alternative would be to clear or remove the key if you need to run a task again.&nbsp;</p>
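<p>Conceptually, the run-once tracking looks like this (the registry path is abbreviated - the text only identifies the "init" sub-key of the RMM Suite registry path):</p>

<pre>
<code>HKLM\...\init
  "Install Titanium-Man AV" = &lt;run date&gt;    ; value present - script will not run again
  "Update XXX to V3.45"     = (not present)  ; new name - script runs once, then is recorded</code></pre>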
<br /><a href='http://mspbuilder.com/blog-onboarding-automation-hands-off-workstation-build-1'>gbarnas</a>&nbsp;&nbsp;<a href='http://mspbuilder.com/blog-onboarding-automation-hands-off-workstation-build-1'>...</a>]]></description>
      <link>http://mspbuilder.com/blog-onboarding-automation-hands-off-workstation-build-1</link>
      <author>gbarnas@mspbuilder.com (gbarnas)</author>
      <comments>http://mspbuilder.com/blog-onboarding-automation-hands-off-workstation-build-1</comments>
      <guid isPermaLink="true">http://mspbuilder.com/blog-onboarding-automation-hands-off-workstation-build-1</guid>
      <pubDate>Mon, 15 Aug 2022 14:00:00 GMT</pubDate>
    </item>
    <item>
      <title>STOP! How outdated are your management scripts?</title>
      <description><![CDATA[<div>During a recent audit of an MSP's onboarding processes, I found several Agent Procedures that seemed interesting. I had not seen any other MSP performing some of these configuration steps, so I looked more deeply at the logic in these procedures. What I found would have turned any hair I had left white!</div>

<div>&nbsp;</div>

<div>One procedure in particular was named "Set Access Rights for PerfMon Folders". "What PerfMon folders?" I wondered. Looking at the procedure, the description stated that it was modifying the Kaseya working folder permissions to allow PerfMon to access the KLogs folder. It did this by changing the permissions to "Everyone:Full Control"!&nbsp;</div>

<div>&nbsp;</div>

<div>Looking closer, I was able to determine that this procedure was quite old - likely developed for VSA version 6 or earlier - and had never been updated. While it's possible that older versions of VSA did not provide adequate access to the KWorking folder, that is no longer the case. Administrators have full control, and even users have Read &amp; Execute, so there is no issue with PerfMon reading this location.&nbsp;</div>

<div>&nbsp;</div>

<div>The most important thing to realize is that things change. If you have processes that haven't changed in years, it's time to give them a review and decide whether they are still needed or are in need of an update. This procedure, if not identified, would have introduced significant risk into the MSP environment by granting Full Control rights to every account on a critical system folder. Imagine a malicious user replacing an EXE or updating a script to call malware or ransomware. If an agent procedure doesn't replace these scripts and blindly calls them - often with SYSTEM rights - the damage could be extensive.</div>

<div>&nbsp;</div>

<div>Why risk this? Take time to review your procedures and tools to make sure they are still required and operate in compliance with today's security model. Remove processes that are no longer needed, and update those that are still needed to follow current security requirements. The business you save might be your own!</div>
<br /><a href='http://mspbuilder.com/blog-how-outdated-are-your-management-scripts'>gbarnas</a>&nbsp;&nbsp;<a href='http://mspbuilder.com/blog-how-outdated-are-your-management-scripts'>...</a>]]></description>
      <link>http://mspbuilder.com/blog-how-outdated-are-your-management-scripts</link>
      <author>gbarnas@mspbuilder.com (gbarnas)</author>
      <comments>http://mspbuilder.com/blog-how-outdated-are-your-management-scripts</comments>
      <guid isPermaLink="true">http://mspbuilder.com/blog-how-outdated-are-your-management-scripts</guid>
      <pubDate>Sun, 23 Feb 2020 14:00:00 GMT</pubDate>
    </item>
    <item>
      <title>Managed Variables - Take 2</title>
      <description><![CDATA[<p><strong>VSA Managed Variables, Problems, and SaaS-Friendly Solutions</strong></p>

<p>Well, just one month ago, I discussed using Managed Variables and reviewed some of the challenges with them.&nbsp;A current bug in VSA leaves us with the inability to deal with undefined Managed Variables, which means we can't use the presence or absence of a Managed Variable to decide IF we should perform a task.</p>

<p>I use Managed Variables - a lot - for everything from passing arguments that control how our utilities function, to defining local accounts, and even customer licensing and configuration information for commonly deployed applications. At last count, I've got 16 Managed Variables. Nearly all of them - all but two, in fact - define customer-specific information that may not exist for every customer, so the ability to detect when the data is defined is crucial.</p>

<p>Last month, I provided a workaround that we've been using internally for almost 4 years. It's a simple SQL Query that takes two arguments - the name of the Managed Variable and a default response. It reads the Managed Variable and returns the data if it's defined, or the default value if it isn't. This makes it easy to detect that the default value was returned and skip the action that depends on the variable being defined. Easy-peasy! Well, that's the case if you're an on-prem user of VSA. Kaseya SaaS users have, well, let's just say "challenges".</p>

<p>I understand that caution is needed in a shared environment, and that allowing access to the SQL back-end, even for queries, can create great angst. We've been waiting over 6 weeks to get the query approved for our TAP instance of Kaseya SaaS so that we can certify our automation. This delay was just too much to bear, especially with SaaS users asking about our automation suite, so - it was time to dig in and find an alternate solution. The challenge was that reading an undefined Managed Variable didn't just fail, it caused the procedure where it was referenced to crash at that point! The answer came - literally - while lying awake at night...</p>

<p><em>"If reading an empty Managed Variable crashes the procedure, then let a different procedure crash!"</em></p>

<p>Here's the process, in a nutshell. Start by creating a procedure that assigns the Managed Variable(s) that you need to global variables:</p>

<pre>
<code>getVariable("ConstantValue", "&lt;InitArgs&gt;", "global:MV_InitArgs", "All Operating Systems", "Continue on Fail")</code></pre>

<p>The global variable name uses an "MV_" prefix to identify it as a Managed Variable in this example, and uses a name that makes its purpose clear. In this case, it will contain any custom arguments used by our agent initialization utility. In some cases, multiple Managed Variables are needed, and we simply collect all of them in the same procedure - each with a unique name. One such example is an application that needs the customer's product serial number, license key, and customer ID to perform an installation.</p>
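<p>For the serial number / license key / customer ID example just mentioned, the collector procedure simply stacks the lookups - the variable names here are illustrative:</p>

<pre>
<code>getVariable("ConstantValue", "&lt;AppSerial&gt;", "global:MV_AppSerial", "All Operating Systems", "Continue on Fail")
getVariable("ConstantValue", "&lt;AppLicKey&gt;", "global:MV_AppLicKey", "All Operating Systems", "Continue on Fail")
getVariable("ConstantValue", "&lt;AppCustID&gt;", "global:MV_AppCustID", "All Operating Systems", "Continue on Fail")</code></pre>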

<p>Next, in the procedure that needs the Managed Variable(s):</p>

<pre>
<code>getVariable("ConstantValue", "xFALSEx", "global:MV_InitArgs", "All Windows Operating Systems", "Halt on Fail")
executeProcedure("ALL-GetManagedVar-InitArgs", "", "Immediate", "All Operating Systems", "Continue on Fail")
If checkVariable("#global:MV_InitArgs#") Contains "FALSE"
  executeShellCommand("CMD.exe /c RMMINIT.BMS", "Execute as System", "All Windows Operating Systems", "Continue on Fail")
else
  executeShellCommand("CMD.exe /c RMMINIT.BMS #global:MV_InitArgs#", "Execute as System", "All Windows Operating Systems", "Continue on Fail")</code></pre>

<p><br />
This calls the first procedure with "Continue on Fail". If the Managed Variables aren't defined, that procedure fails, but does not affect the primary procedure. The primary procedure can then determine whether the default value is still in place and take appropriate action. In this case, it runs the init command without passing arguments; if the default value is not detected, it runs the init command with the arguments in the global variable. Note that the global variable is defined and set to a default value here before calling the external procedure, which will overwrite it if the Managed Variable is defined.</p>

<p>I'll admit, this is a bit kludgey, but it does allow one to leverage Managed Variables on all platforms, including SaaS, without causing the primary procedure to crash.</p>
<br /><a href='http://mspbuilder.com/blog-managed-variables-take-2'>gbarnas</a>&nbsp;&nbsp;<a href='http://mspbuilder.com/blog-managed-variables-take-2'>...</a>]]></description>
      <link>http://mspbuilder.com/blog-managed-variables-take-2</link>
      <author>gbarnas@mspbuilder.com (gbarnas)</author>
      <comments>http://mspbuilder.com/blog-managed-variables-take-2</comments>
      <guid isPermaLink="true">http://mspbuilder.com/blog-managed-variables-take-2</guid>
      <pubDate>Mon, 19 Mar 2018 20:15:00 GMT</pubDate>
    </item>
    <item>
      <title>Using VSA Managed Variables</title>
<description><![CDATA[<p>VSA Managed Variables offer an excellent way to provide customer- and machine-group-specific values for use in Agent Procedures. We have more than 20 unique Managed Variables defined to help configure application licenses, define local account credentials, and control how applications are deployed and used. The feature suffers, however, from one very debilitating bug: if a variable is not defined for every customer and machine group, referencing it can cause the procedure to fail. There is no way to test whether the variable is defined, and the simple act of referencing an undefined variable will terminate the procedure, even if "Continue on Fail" is selected.</p>

<p>MSP Builder has developed a work-around for this using a SQL Query. The query is invoked with two user arguments - the name of the Managed Variable and a Default Value. The query looks up the Managed Variable based on the agent's machine.group value. If the value is defined, it is returned, otherwise the default value is returned instead. This allows you to perform tasks using Managed Variables with greater flexibility - either use a custom or default value, or perform a task only when a custom value is defined. We use this to create a customer-specific local account, only for customers that request this.</p>

<p>You can download the XML file that defines the SQL Query. The Zip file contains the XML file and a readme that illustrates how to use the query in a procedure and describes the installation steps. SaaS customers will need to submit a request to have the query installed into their instance. Do not change the XML filename or contents; doing so can delay your request, because Kaseya must review and approve the query before installing it in a SaaS instance.</p>

<p><span class="font-large"><strong><em>UPDATE!!</em></strong></span></p>

<p><span class="font-normal">See "Managed Vars - Take 2" for a method that works on SaaS or On-Prem without any SQL code!</span></p>

<p>&nbsp;</p>
<br /><a href='http://mspbuilder.com/blog-using-vsa-managed-variables'>Admin</a>&nbsp;&nbsp;<a href='http://mspbuilder.com/blog-using-vsa-managed-variables'>...</a>]]></description>
      <link>http://mspbuilder.com/blog-using-vsa-managed-variables</link>
      <author>support@mspbuilder.com (Admin)</author>
      <comments>http://mspbuilder.com/blog-using-vsa-managed-variables</comments>
      <guid isPermaLink="true">http://mspbuilder.com/blog-using-vsa-managed-variables</guid>
      <pubDate>Mon, 19 Feb 2018 17:53:00 GMT</pubDate>
    </item>
    <item>
      <title>Good Habits for Writing Procedures</title>
<description><![CDATA[<p>I participated in a panel at Kaseya Connect that talked about automation. One of the audience questions was "How do you manage the creation of procedures by your team?" That simple question led to this blog post, where I'll explain in more detail what we do.</p>

<p>First of all, I encourage all of our team members to create procedures. I use a small document "Intro to Programming" that I wrote when I taught a Windows Administration course at a local college. It's very basic and presents concepts that can be easily understood, even by those with little or no exposure to programming. This doesn't mean, however, that I turn my engineers loose as "wannabe programmers"!</p>

<ol>
	<li>As an introduction, I have a one-page sheet that explains how to create Kaseya procedures. It covers the following points:
	<ul>
		<li>All procedures must have a complete and accurate description field that explains the purpose of the procedure.</li>
		<li>Each procedure must have liberal comments that define the logic - what are you trying to do? This allows other engineers to understand what's being done should things go awry sometime in the future.</li>
		<li>Start with the procedure template - it illustrates the standard ways we perform common tasks, such as defining the KWorking folder; using the TEMP folder whenever possible and cleaning up after the install or process completes; and handling Managed Variables in a way that provides a default value if the MV isn't defined. (The link above is a zip file with the Intro to Programming PDF and the Kaseya procedure template.)</li>
	</ul>
	</li>
	<li>Each procedure undergoes a code review before the engineer can even test-run it. The review verifies that the points above were followed, checks the basic logic, and ensures that common code methods are used. Once the review is complete, the procedure is approved and the engineer can test their code. Yes, this takes work, especially when the engineer hasn't thought the logic all the way through! It does, however, ensure that any engineer can support any of the procedures, and it makes them more effective at that task. The unwritten goal is to get a procedure published for production in the minimum number of review/approval passes! The review is also an opportunity to explain why one logic method might be preferred over another, and why standard methods - even when they contain more lines of code - are better in the long run.</li>
	<li>When testing is complete, one of the NOC Management team members will move the procedure into the shared folder and send an announcement that the new procedure is available.</li>
</ol>
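<p>The temp-folder discipline the template enforces can be sketched in Python. The real template expresses these steps as Kaseya procedure commands, so this is only the pattern, with hypothetical names: stage all work in a private temporary folder and guarantee cleanup when the install or process is done, even on failure:</p>

```python
import tempfile
from pathlib import Path

def run_install(payload: bytes) -> str:
    """Stage the work in a private temp folder and clean up afterwards,
    even if the install step fails - the pattern our template enforces.
    Returns the folder path so the caller can confirm it was removed."""
    with tempfile.TemporaryDirectory(prefix="kworking-") as workdir:
        staged = Path(workdir) / "installer.bin"   # hypothetical staged file
        staged.write_bytes(payload)                # stage the payload
        # ... run the installer from `staged` here ...
        return workdir                             # folder is removed on exit

workdir = run_install(b"fake installer bytes")
Path(workdir).exists()  # False - cleanup happened automatically
```

<p>The point of the pattern is that cleanup is structural rather than a step an engineer might forget - the same reason the template bakes it in.</p>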

<p>Much of what we do builds on standard ways of operating, and the procedures are no exception. Standards produce consistent operation, which improves reliability. Reliability in the environment is a double win for us - we get fewer alerts and the customer has fewer problems, making them more willing to work with us on projects. The effort needed to follow these standards is minimal compared to the dividends it pays.</p>

<p>&nbsp;</p>
<br /><a href='http://mspbuilder.com/blog-good-habits-for-writing-procedures'>Admin</a>&nbsp;&nbsp;<a href='http://mspbuilder.com/blog-good-habits-for-writing-procedures'>...</a>]]></description>
      <link>http://mspbuilder.com/blog-good-habits-for-writing-procedures</link>
      <author>support@mspbuilder.com (Admin)</author>
      <comments>http://mspbuilder.com/blog-good-habits-for-writing-procedures</comments>
      <guid isPermaLink="true">http://mspbuilder.com/blog-good-habits-for-writing-procedures</guid>
      <pubDate>Mon, 15 May 2017 20:21:00 GMT</pubDate>
    </item>
  </channel>
</rss>