Let’s face it, your next web or digital project is going to be a massive undertaking. Whether it’s a brochure-style marketing website or a full-blown enterprise e-commerce solution, making the right choices up-front is going to set the tone for the entire lifecycle of the site – not just the development phase.

One of the most critical elements to decide on up-front is which platform you’ll be using. Which one should you use, anyway? There’s truly no one-size-fits-all solution, but here’s a handful of points to consider that can help put you on the path to success!

The Right Platform

What is the scope of my project?

If you’re throwing up a temporary microsite in a hurry – do you really need a CMS and the development costs that go along with it? If your site is going to be a long-term investment, what does the future hold? Today, you may need a single corporate website that can be managed by 2-3 people. Something like WordPress would be a fast, inexpensive solution for this type of need. But what if during the lifetime of the site you expect to support multiple sites for new international regions or audiences? Will you want to grow into e-commerce? In this case, something like Magento or Sitefinity Enterprise might be a better fit.

Who will be managing the content?

That is, who will be making the day-to-day changes and updates to the content? Will it be someone with a technical or development background? Or is it more likely to be a marketing pro or channel manager?

If the system will be managed by someone more technical, consider a system that is more configurable and flexible. Many open source options such as Orchard, Drupal, and Joomla are highly configurable and extensible, and free of any sort of licensing costs. However, these systems require more development labor to set up up front, and their interfaces are geared toward developers, not content managers.

If you expect to have less-technical, marketing-focused users doing the bulk of the work in the system, focus your search on something more intuitive. Systems like Sitefinity, Sitecore, Magento, and WordPress allow for structured templates geared toward simplifying content management for non-technical users.

How will the site be supported?

Where will you go when you need help? At some point, you’ll likely need work done to your system that exceeds your internal capabilities. You may need a custom integration, or perhaps you’ll run into a bug that you can’t tackle on your own.

If you’re working with a vendor to do the initial site build, will they also be available to support the site post-launch? What is their support model, and how are they staffed? If your site goes down, what will the impact be on your business? If you expect little impact, you can likely make your decision based on cost. If a prolonged outage or major bug means a big hit to your bottom line, look for a partner that can provide professional support via multiple lines of communication so you can expedite your fix.

Also, think about the long-term “supportability” of the solution itself. Well-known platforms with a long track record, such as WordPress, will have a huge following. Likewise, paid solutions like Sitefinity or Magento Enterprise have a team on staff that will help solve problems. Using a dated or unpopular solution could leave you with a problem that few can help you solve.

Is the CMS a good fit for ALL parties?

This may seem obvious, but be sure to view a demo of the product or try it on a small scale. Get a feel for the interface and be sure you and your teammates are comfortable. A feeling of confusion or being overwhelmed will be a barrier to maintenance of your site down the road.

What may be less obvious is gauging the comfort level and fit with your vendor and technical resources. If you already have a relationship with a trusted web vendor – check with them! Be sure they’re comfortable with your preferred CMS. If they’re not, you could be introducing unnecessary cost and technical debt to your project.

Summary

There’s still a ton of work to do – but now you can start thinking about what to look for to ensure your CMS is the right fit! Still feeling overwhelmed? Contact us and we can help you uncover the answers that will ensure you make the right choice for your needs.

You Get a Script, You Get a Script, Everyone Gets a Script!

Scripts, scripts, scripts, scripts! Let’s talk about scripts! Recently, at Mercury New Media, we had a client request that we upload IIS log files to their server on a nightly basis. The request was to rename the IIS logs to follow a requested naming convention and to upload the log files in a zip archive each night. We tossed a few ideas around, like console apps and custom services, and ultimately settled on a PowerShell script executed by our VM’s Task Scheduler to fulfill our client’s need. A simple yet effective solution that would be portable and flexible.

Setup

Before we started working on uploading our client’s files, we made sure that we were able to use the PowerShell ISE for v3 and v4 of PowerShell. This is installed on Windows 8 by default, which meant there was no download required. Instead, all we had to do was launch the ISE and start to dig in! To do this, you can search your Windows 8 machine for PowerShell ISE and be on your merry way!
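If you want to double-check which version your session is actually running before digging in, PowerShell can tell you itself. A quick sanity check (works in any v2+ session):

```powershell
# Print the engine version of the current PowerShell session.
# Expect a Major value of 3 or 4 for the scripts in this post.
$PSVersionTable.PSVersion
```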

Step One

To accommodate the client’s needs, we first took a look at renaming IIS log files and zipping them up into the requested archives. Luckily, IIS conveniently names log files with a dated naming convention within each site’s respective folder, making it easy to sort through log files and select the appropriate log for the day. Once located, we could start naming and zipping files to our heart’s content! A fairly straightforward task that can be executed with the PowerShell script below:

$date = (Get-Date).AddDays(-1).ToString("yyMMdd")
$ZipTimestamp = (Get-Date).AddDays(-1).ToString("yyyyMMdd")
$DestZip = 'C:\inetpub\logs\LogFiles\W3SVC32\'
$Dest = "C:\inetpub\logs\LogFiles\W3SVC32\u_ex" + $date + ".log"
$ZipFileName = $DestZip + $ZipTimestamp + "Naming Convention" + ".zip"
# Create an empty zip file by writing the zip end-of-archive header bytes.
Set-Content $ZipFileName ("PK" + [char]5 + [char]6 + ("$([char]0)" * 18))
# Wait for the zip file to be created.
while (!(Test-Path -PathType Leaf -Path $ZipFileName))
{
    Start-Sleep -Milliseconds 20
}
# Use the Windows shell COM object to copy the log into the zip archive.
$ZipFile = (New-Object -ComObject Shell.Application).NameSpace($ZipFileName)
Write-Output (">> Waiting Compression : " + $ZipFileName)
$ZipFile.CopyHere($Dest)

The script locates our log file, renames it to the client’s requested naming convention, and then zips it up, all in a matter of milliseconds. Pretty quick and easy if you ask me!

Step Two

Next, we needed to upload the files to our client’s FTP server! Another task that can be easily tackled by our beloved PowerShell script!

The client was kind enough to provide FTP credentials, so all we needed to do was grab the files and ship them off to their new home! Now, if you look at the script above, you’ll notice the zip file is created in our directory but never removed. In this step, we also decided to remove the zipped files once uploaded, to avoid cluttering our log folders. The PowerShell script below takes our fresh zip file, uploads it, and then removes it from the directory:

# Send file through internet tubes
$file = $ZipFile.Title
$ftp = [System.Net.FtpWebRequest]::Create("ftp://clientdirectory/$file")
$ftp.Method = [System.Net.WebRequestMethods+Ftp]::UploadFile
# Placeholder credentials -- substitute the client's real FTP login.
$ftp.Credentials = New-Object System.Net.NetworkCredential("username", "password")
$ftp.UseBinary = $true
$ftp.UsePassive = $true
$content = [System.IO.File]::ReadAllBytes("C:\inetpub\logs\LogFiles\W3SVC32\$file")
$ftp.ContentLength = $content.Length
$rs = $ftp.GetRequestStream()
$rs.Write($content, 0, $content.Length)
$rs.Close()
$rs.Dispose()
# Confirm the server accepted the upload before deleting the local copy.
$ftp.GetResponse().Close()
Remove-Item C:\inetpub\logs\LogFiles\W3SVC32\$file

Wrap Up

The script sends our files to the new client directory and then removes the file from our local folder so as not to clog up our directories. We scheduled this PowerShell script to run each morning at 12:01 AM using Windows Task Scheduler. A simple script was all we needed to tackle this ongoing task!
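If you’d rather script the scheduling step too, the ScheduledTasks cmdlets that ship with Windows 8 and Server 2012 can register the nightly job for you. This is just a sketch under our assumptions; the task name and script path below are placeholders, not our actual values:

```powershell
# Placeholder task name and script path -- substitute your own.
$action  = New-ScheduledTaskAction -Execute "powershell.exe" `
    -Argument "-ExecutionPolicy Bypass -File C:\Scripts\UploadLogs.ps1"
# Fire daily at 12:01 AM, matching the schedule described above.
$trigger = New-ScheduledTaskTrigger -Daily -At "12:01AM"
Register-ScheduledTask -TaskName "Nightly Log Upload" -Action $action -Trigger $trigger
```

Note that Register-ScheduledTask needs to run from an elevated PowerShell prompt.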