WordOps is a popular framework for setting up and managing WordPress infrastructure, forked from EasyEngine in 2018 when EasyEngine v4 changed its architecture to use Docker containers.
This is a quick guide to setting up a development copy of your live WordPress site which can be useful if you’re working on a redesign or testing new themes.
If you’re not familiar with WP-CLI, it does some magic behind the scenes to extract the database connection details from wp-config.php, so as long as we’re in the directory of a WordPress installation it will figure out what to do.
1. First, we’ll create the new site (dev.mysite.com) as a WordPress site with Let’s Encrypt enabled
sudo wo site create dev.mysite.com --wp --le
2. Next, we’ll clean up the default content on the newly created dev site and copy across the files from the production site. Run the copy from the production site’s htdocs directory (e.g. /var/www/mysite.com/htdocs)
sudo rm -rf /var/www/dev.mysite.com/htdocs/*
sudo cp -a * /var/www/dev.mysite.com/htdocs
3. Now we can use WP-CLI to export the database from the production site and import it into the dev site. Run the export from the production site’s htdocs directory, then run the clean and import from the dev site’s htdocs directory.
sudo -u www-data -H wp db export ~/mysite.com.sql --yes
sudo -u www-data -H wp db clean --yes
sudo -u www-data -H wp db import ~/mysite.com.sql --yes
4. Finally, use WP-CLI to update the WordPress URLs on the dev site
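As a sketch (assuming the hostnames and paths from the steps above), wp search-replace handles serialized data in the database safely, and --skip-columns=guid is the usual recommendation so post GUIDs stay stable:

```shell
# Run from the dev site's htdocs so WP-CLI reads the dev wp-config.php
cd /var/www/dev.mysite.com/htdocs

# Rewrite every occurrence of the production URL in the dev database
sudo -u www-data -H wp search-replace 'https://mysite.com' 'https://dev.mysite.com' --skip-columns=guid

# Flush any cached values that still reference the old URL
sudo -u www-data -H wp cache flush
```

If your site mixes http and https URLs, you may need a second pass with the other scheme.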
Atlassian don’t officially support AD FS with Confluence Cloud – but it is working well now that I’ve sorted out the issues I was having passing users’ email addresses through as the NameID claim. Hopefully these instructions can save you some trial and error.
Enable SAML on Atlassian Cloud
First off – enable SAML on your Atlassian Cloud instance at https://<subdomain>.atlassian.net/admin/saml/edit
The Identity Provider Entity ID can be found in the Federation Service Properties in ADFS – but typically will look like mine –
Identity Provider SSO URL can be found in AD FS Service > Endpoints – look for the SAML 2.0 type, but it should just be
Open up your token signing certificate in AD FS, then select ‘Copy to File’ from the Details tab. Save it as a Base64-encoded .txt file – then copy the contents into the Public x509 certificate field.
Add Relying Party Trust wizard
Add a Relying Party Trust to AD FS. On the welcome page select ‘Enter data about the relying party manually’
Select a display name – i.e. Atlassian Confluence
Use the AD FS profile (supports SAML 2.0)
Leave the token encryption certificate blank
Enable support for the SAML 2.0 WebSSO protocol – and enter the SP Assertion Consumer Service URL from the Atlassian Site Administration > SAML section. Currently this is:
For the relying party trust identifier, enter the SP Entity ID – currently this is
Please note: do not be tempted to add additional relying party trust identifiers (I had added some others in here, which caused it not to work)
Optionally configure multi factor authentication settings
Configure the claim rules
First create a rule to send attributes from Active Directory to Atlassian Cloud. I think the only mandatory claim is the email address.
Next, add a second rule to Transform an incoming claim
(This is another step I hadn’t figured out the first time I tried to configure SAML – without it, AD FS doesn’t use the right format for the outgoing NameID.)
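For reference, here is roughly how my two rules end up looking in the claim rule wizard – a sketch of the standard email-as-NameID setup, so double-check the attribute names against your own directory:

```
Rule 1 – Send LDAP Attributes as Claims
  Attribute store:  Active Directory
  LDAP attribute:   E-Mail-Addresses  ->  Outgoing claim type: E-Mail Address

Rule 2 – Transform an Incoming Claim
  Incoming claim type:      E-Mail Address
  Outgoing claim type:      Name ID
  Outgoing name ID format:  Email
  Pass through all claim values
```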
Test it out
I haven’t got Identity Provider initiated sign-on working yet (via /adfs/ls/idpinitiatedsignon.aspx) – but if you use a RelayState URL and put it in your corporate bookmarks etc. it should work nicely (replace the <yoursubdomain> part).
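For reference, the RelayState URL format Microsoft documents looks something like this – everything in angle brackets is a placeholder, and both inner parameters need to be URL-encoded:

```
https://<your-adfs-server>/adfs/ls/idpinitiatedsignon.aspx?RelayState=RPID%3D<SP Entity ID>%26RelayState%3D<target URL>
```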
Found a strange issue after switching to Asterisk v11, the BLF buttons on our Panasonic SIP phones stopped working.
Eventually I thought it was worth trying the ‘presence server address’ setting, which previously hadn’t been set (and in Endpoint Manager is set to blank). I pointed it at the PBX and the lights immediately started working. A much simpler solution than I anticipated.
Had a frustrating issue with some UniFi APs where clients were not able to authenticate to the Pro models, but OK to the standard UniFis.
Running a packet capture on the NPS server, I could see many Access-Requests arriving with an Access-Challenge immediately being sent back, but the AP would just keep sending the same request and the server was neither rejecting nor allowing the connection.
The MS article recommends using a Framed-MTU of 1344, but we ended up settling on 1400. We did have jumbo frames enabled on the server running the NPS role, which I think may have been contributing to the problem. Hope this can help someone out!
Veeam, known as one of the leading providers of enterprise virtual backups, have just announced they will be releasing a free backup tool for desktop users, providing automatic backups to a NAS or other hard drive.
Veeam Endpoint Backup FREE looks like it will be a great set and forget solution, allowing both simple file recovery or bare metal recovery. Using Mac OS X at home with Time Machine, I often wish there was a good free equivalent to recommend for Windows users. I’m sure there are options out there, but I really trust Veeam and it looks like it will be a nice simple product with no pressure to upsell to a paid version (there is none).
Don’t forget you still need an offsite backup – so team this up with a cloud backup, have another hard drive which you rotate offsite, or buy a pair of NASes, set up replication and distribute them among your family (those 50Mbps upload UFB plans have to be good for something, right?).
If backing up to a NAS, it is a very good idea to set up a separate shared folder and user account on the NAS specifically for the backups. Never ‘map’ this backup folder in My Computer, and your own user account should have read-only access to it – hopefully we will be able to configure a UNC path and credentials within Veeam directly. This helps minimise the possibility of ransomware or other malware scanning your network for files to delete. I haven’t heard of anything doing this yet, but there is definitely malware out there which deletes or encrypts files on mapped network drives.
The first beta will be released in November, with the final version scheduled for early 2015.
Chorus have updated their maps in the last week to show UFB is now available in more areas in Greymouth.
It is still unclear which ISPs will be providing UFB in Greymouth – Snap is probably your best bet currently for home and basic business use; DTS can do business connections, and apparently HD are offering both residential and business connections. I assume Spark will also provide access here soon if they aren’t already. New fibre-only ISP MyRepublic looks interesting, but they say they need a few interested people to sign up at once before they will install the required equipment in Greymouth.
Ultrafast broadband really changes the whole way we can think about how we use technology at home and business. At a ~50 user site, the changes we are looking at immediately once UFB is installed include:
Moving our email (Exchange) over to Office 365, instead of having to maintain an email server on-site
Using Windows Updates directly from Microsoft instead of caching them all on a server locally
Switching from a traditional web content filtering + caching solution to a fast, NGFW (Next-generation firewall) to reduce potential points of failure and bottleneck
Shifting more phone lines across to Voice over IP
Making more use of online backup services
Providing better remote access for staff wanting to work from home
Caution is also required going into the future. If your phone line is switched over to being provided through UFB instead of Chorus copper, you will lose phone access during power cuts. One thing that Telecom (now Spark) have been fantastic at in the past is providing an incredibly reliable phone network: even in a power cut, corded phones would still work, and even with a cable or fibre cut you could at least call people locally.
Of course this is less important for some people these days, with most having cell phones, but we know from the Christchurch earthquake that cellphones a) also need power and b) get overloaded, so they can’t be relied upon.
The good news is that Spark currently aren’t requiring you to give up your landline, but this will change in the future. When it does, make sure you buy a good quality UPS (uninterruptible power supply) which can at least keep your phone running for a few hours. Let’s hope a good quality, affordable option is available by the time Spark start switching people over.
We finally got our new Epson EB-4650 (unsure of the exact model) projector connected to the network this week, allowing me to complete our Roomie Remote setup: controlling a projector, Marantz receiver, DVD player and Freeview box.
Although Roomie Remote had a one-size-fits-all Epson projector definition, I couldn’t get it working with IP control.
Knowing the projector supports PJ-Link, I set out to see how easy it would be to implement the well documented PJ-Link protocol in Roomie.
Without further ado:
Back up your Roomie settings to Dropbox
Download plistEditor Pro (it is either trialware or shareware)
Open the Dropbox\Roomie\RoomieCodes.plist file. If it doesn’t exist, create it
Add in the code below. We only need to switch between LAN and HDMI1, so I haven’t tested the other inputs, but feel free to tweak the Gist below.
Save, restore the settings from Dropbox into Roomie
Create a new device, entering the projector IP and PJ-link port 4352, select ‘Generic’ – ‘PJ-Link Compatible’
I’m not sure why PJ-Link isn’t included in Roomie, but until it is this should let you control a decent number of auditorium/installation projectors over IP.
We can now leave our four remotes in the drawer where they belong, instead able to use one touch actions to power up the devices and choose the right inputs – all from an iPod touch or iPad.
P.S. This will only work for projectors not requiring PJLink authentication – I didn’t look into how to handle that.
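If you want to poke at the protocol directly before touching Roomie, PJ-Link class 1 commands are just short ASCII strings sent over TCP port 4352, so they are easy to test with netcat. A sketch, assuming no PJLink authentication and a hypothetical projector address:

```shell
# Send one PJ-Link class 1 command to a projector and print its reply.
# The projector greets with "PJLINK 0" when authentication is disabled.
pjlink() {
  printf '%%1%s\r' "$2" | nc -w 2 "$1" 4352
}

# Usage (hypothetical address; input codes per the PJ-Link spec):
# pjlink 192.168.1.50 'POWR 1'    # power on
# pjlink 192.168.1.50 'POWR ?'    # query power state
# pjlink 192.168.1.50 'INPT 31'   # first digital input (HDMI1 on many models)
```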
Over the past six months I’ve been developing a Student Management System based on Dynamics CRM 2011 for one of the new Trades Academies. I’ll talk about why we chose Dynamics CRM in a later post, but this post is about the integration I built with the TextaHQ SMS Messaging service.
TextaHQ was attractive for its lack of monthly fees, low per-message cost and a two-way API allowing SMS replies. When replies come back, the gateway sends the reply to a callback URL, allowing us to save the message straight into CRM. Not so great if your server goes down for a few hours, but it does mean we don’t have to run a service to poll for new messages like some APIs require.
I would love to have published this up into a nice how to guide but probably not going to have time to do that for a while, so I thought I’d code dump for now instead.
My solution consists of three parts: the SMS Message entity, the plug-in assemblies for sending the messages, and an ASP.NET form to save the replies back into CRM.
SMS Message Entity
A new ‘activity’ entity named SMS Message
Add a field named sendernumber – this is where the sender number of mobile replies will be put
These are the status codes I am using:
Draft (1) – Default Value
Pending Send (352,400,002) – Default Value
Cancelled (3) – Default Value
Setup the form – this is what mine looks like
Create a Web Resource called ‘smsconfig’ – an XML file. Format it like below with the URL and API key from your TextaHQ account
Contains a (rather bad) phone number cleaning method; a method to read the url & key from the configuration XML file; code for querying the ‘smsconfig’ web resource and the code to post the message to the gateway
Contains the definitions of the statuscodes I defined above
The code that should be triggered when the statuscode of the smsmessage entity is updated
Checks if the status code is in the ‘Completed – Pending Send’ state (the user clicks ‘Save and Complete’ on the SMS Message activity)
Retrieve the needed data from fields, check the message isn’t blank
If the regarding entity is a contact, sends the message to the contact
If the regarding entity is a course (you can delete this functionality if you like), it sends the message to all of the contacts enrolled in the course with a mobile phone
Updates the SMS Message record to the Completed – Sent status (or Open Failed if it doesn’t manage to send any messages)
We send the Guid of the contact the message is being sent to, as well as the Guid of the creator of the message, as user data to the TextaHQ API – this data is stored with the message, and if a reply comes back the data is fed back to us. That allows us to assign the reply to the original sender and set it regarding the correct contact.
This cool bit of code lets you send SMS messages from workflows! It takes the following parameters
User to assign replies to (system user/owner)
It then returns a MessageSent boolean to let you know whether it sent or not.
In fact, if you wanted, you could just register this workflow activity and forget about SendSMS.cs – but I needed SendSMS.cs to allow me to send an SMS message to a whole course full of students.
(You would just set up a workflow to trigger when the statuscode of the SMS message is set to Completed – Pending Send, then send the SMS with the appropriate variables, and if it manages to send, update the status code to Completed – Sent or Open – Failed.)
Registering Plug-in assembly
Build the plug-in assemblies and register – this is what the step looks like for me for SendSMS.cs
You should now in theory be able to send SMS messages. I’ve added a ‘Save and Complete’ button to the toolbar for SMS Message activities, and renamed it ‘Send SMS’.
Sorry I don’t have time to tidy this up and write proper instructions, but there are some other good posts online which I used to help me get this far.
I would have liked to implement party lists to allow sending to multiple contacts, but don’t really need it at this stage.
Hopefully you might find some useful code snippets that you can adapt for use in your project.
One day I might release it all packaged up as a solution!
I’ll post my SMS reply processing ASP.NET form soon to complete the puzzle.
This isn’t a free tip, but works well for the networks I manage. One of the challenges for any Systems Administrator is keeping software up to date. I’m not so concerned about actually having the latest version of software so much as making sure if there are any security updates these are taken care of in a low effort way.
In your network documentation you should consider every application you have installed on your workstations and determine a software update strategy for each. Our Microsoft products are taken care of by Server Update Services, our Antivirus looks after itself and now we have Ninite for the rest.
If you haven’t come across Ninite before, it is a neat wee tool to install your favourite applications with a couple of clicks.
Ninite Pro adds some awesome features which allow this, such as a command line/silent mode, one touch software updates and caching software downloads. I subscribed to the $20/month plan for up to 100 computers.
There are lots of cool things you can do with the command line reference etc, but all I need is the update mode (which updates any of the Ninite supported software which you have installed on your computer), and to set it up to run on a regular basis. In my case, every time a computer is turned on.
Here is my standard configuration for Ninite
Set up a service account with a secure password for Ninite in Active Directory and document the password in LastPass. It will require permission to install software on your workstations.
Set up a network share for Ninite and add permissions for the Ninite service account.
Put your copy of NiniteOne.exe in the share and create a Logs folder
Set up a Scheduled Task in Group Policy > Control Panel Settings > Scheduled Tasks
Run whether the user is logged on or not, running the task as your service account, and configure for Windows 7. I’m currently investigating a better option for this, as it requires storing the Ninite service account credentials in Group Policy, where they are easily accessible to malicious users.
Triggers – At system startup. You may wish to delay task for 10 minutes, I have it running immediately.
Actions – Start a program
\\fileserver\Ninite$\NiniteOne.exe /silent \\fileserver\Ninite$\Logs\%ComputerName%.txt /updateonly /disableshortcuts
Conditions – Start only if the computer is on AC power
Test it out, when you restart your test workstation a log file should be created for the workstation in the Logs folder, and any software supported by Ninite should be updated and cached in the network folder for a quick install on other machines.