Jim's blog | .Net and SQL Server blog

Automated Teacher’s Supply Lists

by JBrooks 18. May 2017 15:59

I was exposed to the back-to-school scavenger hunt for the first time when our daughter was entering 7th grade. (Don’t know how I avoided it for so long.) We spent 2 hours in Staples looking for a very specific folder; either Staples didn’t carry it or the store we were in was out of them. My wife informed me that we would be trying another Staples.

I’m a software developer, and I couldn’t believe the stores were packed with parents spending so much time on this antiquated and inefficient process. My wife explained that one of the reasons the process was so inefficient was that each item on the list tends to suffer from one of two problems: either the description is too specific, making the item hard to find, or it is too vague, leaving us to make our best guess at what to buy.

That weekend I started a website called SupplyChasers.com where teachers can create a supply list that points to the exact items on Amazon.com. It is free and easy to use, and teachers can make as many lists as they want.

To add items to their list, the teacher starts by searching for an item on Amazon.com. When they find the exact item, they copy its URL from the browser’s address bar and paste it into their list. They do this for each item they add.

When an item’s URL is added to the list, the website uses it to go to Amazon.com behind the scenes and retrieve the item’s image, manufacturer’s name, description, and estimated price. These are all displayed on the teacher’s list. The URL is also used to create a link from the teacher’s list to the product’s page on Amazon, which is useful when parents are viewing the teacher’s list from their phones while shopping.
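What makes this possible is the product ID (the ASIN) that Amazon embeds in its URLs. As a rough sketch of the idea – this is not the site’s actual code, and the URL below is made up – pulling the ASIN out of a pasted link can be as simple as:

# a rough sketch (not the site's actual code): pull the ASIN out of a pasted
# Amazon URL so the product's details can be looked up
$url = 'https://www.amazon.com/dp/B00EXAMPLE/ref=sr_1_1'   # hypothetical URL
if ($url -match '/(dp|gp/product)/([A-Z0-9]{10})')
{
    $asin = $Matches[2]
    Write-Output "ASIN: $asin"
}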

Each list has a unique URL that can be posted on the teacher’s website or can be emailed to the parents.

Of course, teachers can add items to their list that don’t have an Amazon link. But for the items that do, the surprisingly productive part is that parents can set a quantity for each item and then click the “Buy from Amazon.com” button. When they do, all of the Amazon items from the teacher’s list with a quantity greater than zero are automatically added to the parent’s Amazon cart, ready for checkout.

The SupplyChasers.com lists save parents time and frustration and ensure that the students have the correct supplies for the teacher.

Tags:

Database Continuous Integration – DeployTest

by JBrooks 23. February 2017 15:44

Part 4 of 4 – Main PowerShell Script

You can download all of the code here. Below is the main PowerShell script, called Process-InBasket.ps1. It might be scheduled to run every few minutes. It will look for new command files and, if it finds any, call OneScriptCMD, passing in the parameters found in each command.

It will also run any new *.sql file it finds in the In-Basket directory: it will restore a recent backup of production, apply the script, and then email the developers with “Success” or “Failed” in the subject. The body of the email will be the log, and a file with the results of running the script will be attached.
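To get the “every few minutes” schedule, you could register a Windows scheduled task along these lines (a minimal sketch using the ScheduledTasks module; the task name, interval, and paths are placeholders, not part of the download):

# run Process-InBasket.ps1 every 5 minutes (hypothetical task name and paths);
# register this from an elevated PowerShell
$action  = New-ScheduledTaskAction -Execute 'PowerShell.exe' `
    -Argument '-NoProfile -ExecutionPolicy Bypass -File C:\DeployTest\Process-InBasket.ps1 C:\DeployTest\TestDeployConfig.xml'
$trigger = New-ScheduledTaskTrigger -Once -At (Get-Date) `
    -RepetitionInterval (New-TimeSpan -Minutes 5) `
    -RepetitionDuration ([TimeSpan]::MaxValue)
Register-ScheduledTask -TaskName 'DeployTest-ProcessInBasket' `
    -Action $action -Trigger $trigger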

<#
.SYNOPSIS
    Process new SQL script files by applying them to a SQL database.
.DESCRIPTION
    Process-InBasket loads a config file that controls all variables in the
    process. It then looks to a directory to search for new *.sql files.
    Once a new file is found, a backup of the database is restored to SQL
    Server and the script is applied. The process then looks for any SQL
    errors. Finally it sends an email of the log with "Success" or "Failure"
    in the subject.
.PARAMETER ConfigFile
    The config file to use to process the deployment.
    Default: ./TestDeployConfig.xml
.EXAMPLE
    PowerShell "C:\Work\Deploy\Process-InBasket.ps1" ./TestDeployConfig.xml
#>
[CmdletBinding()]
param
(
    [string] $ConfigFile = './TestDeployConfig.xml'
)

cls
$VerbosePreference = 'Continue'
Set-ExecutionPolicy -Scope Process -ExecutionPolicy Bypass

# get a temp file name so we can start logging
$Private:logFileName = [IO.Path]::GetTempFileName()

Set-Location $PSScriptRoot

# load the support scripts
. ".\DeployUtils\Load-DeployConfig.ps1"
. ".\DeployUtils\SQLServer.ps1"
. ".\DeployUtils\Utils.ps1"

Log "Root: $PSScriptRoot" $logFileName
Log ("Loading Config File $ConfigFile") -LogFileName $logFileName

# this will also create and load $Config.C.FileSettings.LogFileName
$Config = Load -ConfigFileName $ConfigFile -LogFileName $logFileName

Log ""
Log ("Processing Matches: " + $Config.C.FileSettings.InFileMask.InnerText)

$Private:processCount = 0
$Private:ret;

# pass the arguments from any new command file to OneScriptCMD
foreach($Private:CmdFile in (Get-ChildItem $Config.C.DirSettings.InCmdDir.InnerText))
{
    $content = Get-Content $CmdFile.FullName
    $cmdArgs = $content[0]
    $author = $content[1]

    $cmd = '"C:\Program Files\DigitalTools\OneScript Client\OneScriptCMD.exe" ' + `
        $cmdArgs + ' -do ' + $Config.C.DirSettings.InBasketDir.InnerText

    Invoke-Expression "& $cmd"

    Move-Item -Path "$($CmdFile.FullName)" -Destination `
        $Config.C.DirSettings.ArchiveDir.InnerText -Force
}

# process any new SQL script files found in the In-Basket
foreach($Private:SQLFile in `
    (Get-ChildItem $Config.C.FileSettings.InFileMask.InnerText))
{
    Log ""
    Log ("Processing: " + $SQLFile)

    $Private:dt = Get-Date -Format yyyyMMdd_HHmmss
    $Private:temp = $Config.C.FileSettings.LogFileFormat.InnerText.Replace("{0}", $dt)

    if($processCount -eq 0)
    {
        Log "Changing log file to $temp"
        Move-Item -Path $logFileName -Destination $temp
        $Config.C.FileSettings.LogFileName = $temp

        # the newest backup file wins
        $Private:DBSourceFile = Get-ChildItem `
            $Config.C.FileSettings.DBSourceFileMask.InnerText | `
            Sort-Object Name -Descending | Select-Object -First 1

        Log ("SQL Backup File: " + $DBSourceFile)

        if(!$DBSourceFile)
        {
            Write-Error ("Searched: " + `
                $Config.C.FileSettings.DBSourceFileMask.InnerText)
            Write-Error "No SQL Backup File Found."
            Log "No SQL Backup File Found."
            Send-ToEmail -Subject "Deploy Failed!, No SQL Backup File Found" `
                -LogFileName $Config.C.FileSettings.LogFileName
            exit -1
        }
    }
    else
    {
        $Config.C.FileSettings.LogFileName = $temp
        Log(("Processing file number: " + ($processCount + 1).ToString()))
        Log ("")
    }

    Log ("Log File: " + $Config.C.FileSettings.LogFileName)

    $Private:buildResultsFile = `
        $Config.C.FileSettings.BuildResultsFileFormat.InnerText.Replace("{0}", $dt)
    Log ("Build Results File: " + $buildResultsFile)

    $ret = RestoreSQLServer $Config.C.DBSettings.Server $Config.C.DBSettings.Database `
        $Config.C.DirSettings.DBFilesDir.InnerText $DBSourceFile

    if($ret -ne $true)
    {
        Write-Error "Restore Database Failed"
        Log "Restore Database Failed"
        Send-ToEmail -Subject "Deploy Failed!, Restore Database Failed" -LogFileName `
            $Config.C.FileSettings.LogFileName
        exit -1
    }

    Log ("Running: " + $SQLFile)

    $ret = RunSQLFile $Config.C.DBSettings.Server $Config.C.DBSettings.Database `
        $SQLFile $buildResultsFile

    if($ret -ne $true)
    {
        Write-Error "Running SQL Script Failed"
        Log "Running SQL Script Failed"
        Send-ToEmail -Subject "Deploy Failed!, Running SQL Script Failed" `
            -LogFileName $Config.C.FileSettings.LogFileName
        exit -1
    }

    Log "SQL Script Run Successful"

    $Private:hasError = CheckForErrors $buildResultsFile

    if($hasError -eq $true)
    {
        Write-Error "SQL Script Had Errors"
        Log "SQL Script Had Errors"
        Send-ToEmail -Subject "Deploy Failed!, SQL Script Had Errors" `
            -LogFileName $Config.C.FileSettings.LogFileName `
            -Attachmentpath $buildResultsFile
    }
    else
    {
        Log (("Deployment Successful"))
        Log (("Completed " + $SQLFile))
        Send-ToEmail -Subject "Deploy Script Successful" -LogFileName `
            $Config.C.FileSettings.LogFileName -Attachmentpath "$buildResultsFile";
    }

    Move-Item -Path "$buildResultsFile" -Destination `
        $Config.C.DirSettings.ArchiveDir.InnerText -Force

    # move the .sql file and any related files to the archive
    $SQLFile = $SQLFile -replace ".sql$", ".*"
    Move-Item -Path "$SQLFile" -Destination `
        $Config.C.DirSettings.ArchiveDir.InnerText -Force

    $processCount = $processCount + 1
}

switch($processCount)
{
    0       {Log (("No files found to process"))}
    1       {Log (("1 Script Processed"))}
    default {Log (($processCount.ToString() + " Scripts Processed"))}
}
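For reference, the command files picked up by the first loop are plain text files: the first line holds the OneScriptCMD arguments and the second (optional) line the commit author – Part 2 below shows the script that writes them. A hypothetical example:

-s 1 -p 3 -fb -ra -b "3.2"
jbrooks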

Tags:

PowerShell | Continuous Integration | Development | OneScript

Database Continuous Integration – DeployTest

by JBrooks 17. February 2017 17:47

Part 3 of 4 - Directories and Configuration


Below are the default directories I use. These could all live under C:\DeployTest, for example.
Directory       Type of Files
--------------  --------------------------------------------------
Archive         Completed files are moved here once processed
DatabaseFiles   SQL Server database data files
DBBackup        Recent backups of production for restoring
DeployUtils     PowerShell support utilities
InBasket        Incoming files needing to be processed
InCmds          Incoming command files used to trigger processing
Logs            Log files
Working         Process working files


I didn't want to hard-code the directory paths because I might make changes such as moving the Archive directory to a shared drive like "\\MyShare\MyProject\Archives". I will also have multiple projects using my PowerShell code, so having config files makes it easier to work with.

The config file for the directories above looks like:

<?xml version="1.0" encoding="UTF-8" ?>
<C>
  <DirSettings>
    <BaseDir>C:\DeployTest\</BaseDir>
    <InBasketDir Parent="BaseDir">InBasket\</InBasketDir>
    <WorkingDir Parent="BaseDir">Working\</WorkingDir>
    <DBSourceDir Parent="BaseDir">DBBackup\</DBSourceDir>
    <ArchiveDir Parent="BaseDir">Archive\</ArchiveDir>
    <DBFilesDir Parent="BaseDir">DatabaseFiles\</DBFilesDir>
    <InCmdDir Parent="BaseDir">InCmds\</InCmdDir>
    <LogDir Parent="BaseDir">Logs\</LogDir>
  </DirSettings>
  <FileSettings>
    <InFileMask Parent="InBasketDir">Northwind_Update_*.sql</InFileMask>
    <DBSourceFileMask Parent="DBSourceDir">Northwind_*.bak</DBSourceFileMask>
    <BuildResultsFileFormat Parent="LogDir">BuildResults_{0}.log</BuildResultsFileFormat>
    <LogFileFormat Parent="LogDir">TestDeploy_{0}.log</LogFileFormat>
  </FileSettings>
  <DBSettings>
    <Server>.</Server>
    <Database>Northwind</Database>
    <BackupDB>0</BackupDB>
  </DBSettings>
  <EmailSettings>
    <SendEmails>1</SendEmails>
    <SMTPServer>smtp.gmail.com</SMTPServer>
    <SMTPPort>587</SMTPPort>
    <UseSsl>1</UseSsl>
    <SMTPUserName>JimInDE@GMail.com</SMTPUserName>
    <SMTPPassword>Someday13!</SMTPPassword>
    <From>JBrooks@DigitalTools.com</From>
    <To>JBrooks@DigitalTools.com</To>
  </EmailSettings>
</C>

Typically you would have one of these per project; this one might be called something like NorthwindDeployTest.xml. Notice that there is a "BaseDir" element and the other directories are relative paths under it, marked with a Parent="BaseDir" attribute. The attribute can be removed if you want to use a different directory that is not under the BaseDir.
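When the config is loaded, every element with a Parent attribute gets its parent's value prepended (the Load function below does this). With the values above, for example:

InBasketDir = BaseDir + InBasket\  ==>  C:\DeployTest\InBasket\
InFileMask  = InBasketDir + Northwind_Update_*.sql  ==>  C:\DeployTest\InBasket\Northwind_Update_*.sql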

So our example where archives were moved to a shared drive would change:

<ArchiveDir Parent="BaseDir">Archive\</ArchiveDir>

To:

<ArchiveDir>\\MyShare\MyProject\Archives\</ArchiveDir>

Now all we need is a PowerShell script to read this XML config file and process it. One of the files under DeployUtils is Load-DeployConfig.ps1, and that is exactly what it does.

Function Load
{
    param
    (
        [string] $ConfigFileName = 'C:\Work\Deploy\TestDeployConfig.xml',
        [string] $LogFileName
    )

    Write-Host "this is it: $LogFileName"

    # Do not make $Config private, it is used by Log()
    Log ("Loading Config File $ConfigFileName") -LogFileName $LogFileName
    [xml]$Config = Get-Content $ConfigFileName

    Log 'Adding LogFileName to Config' $LogFileName

    $logFileNode = $Config.SelectSingleNode('//C/FileSettings/LogFileName')
    if(!$logFileNode)
    {
        $Config.c.FileSettings.AppendChild($Config.CreateElement("LogFileName"))
    }

    $Config.C.FileSettings.LogFileName = $LogFileName
    Write-Host "Loaded: $($Config.C.FileSettings.LogFileName)" $LogFileName

    # Now do not need the optional LogFileName since it defaults to
    # $Config.C.FileSettings.LogFileName
    Log ""
    Log "Process Config File"

    # fill in the parent for all of the elements with a Parent attribute
    foreach($setting in $Config.SelectNodes("//*[@Parent]"))
    {
        Log "$($setting.InnerText) ==>"
        $parent = $Config.C.DirSettings.SelectSingleNode("//" + $setting.Parent)
        # if($parent -ne $null)
        $setting.InnerText = $parent.InnerText + $setting.InnerText
        Log "`t`t$($setting.InnerText)"
    }

    Log ""
    return $Config
}

In your main PowerShell script you would use this to load the config like:

$Config = Load -ConfigFileName $ConfigFile -LogFileName $logFileName

Then you can use the variables like:

Move-Item -Path "$SQLFile" -Destination $Config.C.DirSettings.ArchiveDir.InnerText

Tags:

Development | OneScript | PowerShell | Continuous Integration | SQL

Database Continuous Integration – DeployTest

by JBrooks 12. February 2017 17:36

Part 2 of 4 - Commit Hook

I’ve created a simple Subversion post-commit hook by creating a batch file called post-commit.bat and saving it to my …\SVN\hooks directory. This batch file has only one line, which calls a PowerShell script:

    

%SystemRoot%\system32\WindowsPowerShell\v1.0\powershell.exe -noprofile -executionpolicy bypass -file C:\DeployTest\DeployUtils\CreateOneScriptCMD.ps1 "%1" "%2"

 

(This should all be on a single line.) If you run this by hand to test it, you need to “Run as Administrator.”

The arguments passed to the batch file are passed straight through to the PowerShell script: %1 is the path to the Subversion repository and %2 is the revision number just created by the commit.
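You can also test the PowerShell script directly by passing the same two arguments yourself (the repository path and revision number below are just placeholders; svnlook needs to be on your PATH):

# hypothetical repository path and revision number
.\CreateOneScriptCMD.ps1 -REPOS "C:\Repositories\MyRepo" -REV "1234"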

 

The CreateOneScriptCMD.ps1 PowerShell script will look at the directories just committed and then create a text file with the appropriate OneScriptCMD arguments. I then have a scheduled process that runs every few minutes, sees any new file, and does the real work. I did it this disconnected way so the developer doesn’t have to wait for the full process to complete when they commit.

Below is the CreateOneScriptCMD.ps1 script:

param
(
    [Parameter(Mandatory=$true)]
    [string]$REPOS,
    [Parameter(Mandatory=$true)]
    [string]$REV
)

Set-Location $PSScriptRoot
cls
$VerbosePreference = 'Continue'

# hash table of paths to arguments found in OneScriptClient
$Private:mapPathToArgs = @{"NorthwindDB/branches/" = "-s 1 -p 3 -fb -ra -b {0}" }

$Private:index = -1
$Private:branchPath
$Private:author
$Private:args

foreach($Private:dir in svnlook dirs-changed $REPOS --revision $REV)
{
    Write-Verbose "Checking $dir"

    foreach($Private:key in $mapPathToArgs.keys)
    {
        $index = $dir.indexOf($key)
        Write-Verbose "index = $index"

        if($index -gt -1)
        {
            Write-Verbose "matched $key"
            # $author = svnlook author $REPOS --revision $REV
            # Write-Verbose "Author = $author"
            Write-Verbose "Matched $dir"

            $index = $index + $key.Length
            Write-Verbose "New index $index"

            $len = $dir.indexOf("/", $index) - $index
            Write-Verbose "Len = $len"

            $branchPath = $dir.Substring($index, $len)
            Write-Verbose "Branch is $branchPath"

            $args = $mapPathToArgs[$key] -f $branchPath
            break
        }
    }

    if($args)
    {
        break
    }
}

if($args)
{
    $Private:dt = Get-Date -Format yyyyMMdd_HHmmss
    $Private:cmdFileName = "..\InCmds\OneScriptCmd_$dt.txt"
    $args | Out-File $cmdFileName -Append
    # $author | Out-File $CmdFileName -Append
}

This will generate a file named something like: C:\DeployTest\InCmds\OneScriptCmd_20170212_215709.txt

Its content will be a single line containing the arguments that OneScriptCMD will use to create the release script in the next step. It will look something like:

-s 1  -p 3 -fb -ra -b "3.2"

These are the same arguments you would create from within the OneScript Client Windows program.

Tags:

Continuous Integration | Development | OneScript | PowerShell | SQL

Database Continuous Integration – DeployTest

by JBrooks 5. February 2017 06:23

Part 1 of 4 – Overview

For the next few blog posts I’m going to walk through having our new product OneScript create a SQL release script automatically after a Subversion commit, and then having a process automatically do the following:

  1. Restore a recent backup of our production database to our Deployment Test SQL Server instance.
  2. Apply the release script that OneScript just generated to the database.
  3. Determine if applying the script had any errors.
  4. Send an email to the developers with “Success” or “Failed” in the subject line, with the process log as the body of the email and the results of applying the SQL release script as an attachment.

This will tell the developers almost immediately if the SQL they just checked in causes an error in the release script.

You can download all of the completed code from here.

Most of this is done with PowerShell. If you don’t know PowerShell, I highly encourage you to learn it. It is easy to learn, and Windows is replacing the command prompt with PowerShell as the default shell. I recommend the book below.

 


Learn Windows PowerShell in a Month of Lunches

Tags:

Development | Continuous Integration | OneScript | SQL

Database Continuous Integration

by JBrooks 27. January 2017 16:23

 

We are pleased to announce the release of our product OneScript! It automates the building of a SQL release script from individual SQL change scripts. This allows you to include database changes as part of your continuous integration process.

Who is this for? Developers who work on applications where the contents of the database need to be preserved. It is simple to use and integrates seamlessly into the development process; developers continue to use the tools and practices that they already use.


It works by following these steps:

  1. Developers script out each database change during a development cycle and check the scripts into source control (examples: SVN – Subversion, TFS – Team Foundation Server).
  2. The change scripts are checked into different folders based on the type of database object they change. Example folder names are: Tables, Views, Store Procs, Static Data, etc.
  3. Within OneScript (a one-time setup) each folder name is assigned a sort order, so something like Tables = 100, Functions = 200, Views = 300, Store Procs = 400, etc.
  4. For each release a filter is defined. The logic usually takes the form of “scripts changed since a given date/time” or “include every change since the branch was created” (if you use branches).
  5. Optionally, at the individual file level you can override its normal sort order or filter.
  6. When it does a build, it will filter, sort, and combine all of the individual change scripts into a single release script. You can even do a build from the command line – see the example below!
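For example, a command-line build might look something like this. (The arguments are the ones used in the DeployTest posts above – -b takes the branch name and -do the output directory; your own values will differ.)

# arguments borrowed from the DeployTest posts above (illustrative only);
# -b is the branch and -do is where the combined release script is written
& "C:\Program Files\DigitalTools\OneScript Client\OneScriptCMD.exe" `
    -s 1 -p 3 -fb -ra -b "3.2" -do C:\DeployTest\InBasket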


We built OneScript because we had tried many other approaches, including:

  • Keep a single big change script in source control that the developers keep adding to for a release.
  • Give change script files a name that starts with the next available number (0044_, 0045_, etc.) so they are combined in order into a single release script at build time.
  • Do a “diff” between the Development and Production databases and then script out the differences. Then hand-edit the result to remove anything not wanted and, separately, add our static data scripts.
  • Email change scripts to the tech lead.


Each one of these approaches suffered from one or more of the following problems:

  • Quickly becomes unmanageable.
  • Last one in wins for changes to the same object.
  • Conflicts are hidden until release time.
  • Undoing a change is a manual and tedious process.
  • No history or auditability of changes.
  • Doesn’t lend itself to automation.

To learn more visit our site http://www.OneScript.com

 

Tags:

Development | SQL | OneScript

I Love Idea Paint

by JBrooks 21. September 2016 16:23

We just received our Idea Paint that we are going to put on one of the walls in our new workout room. This reminded me of when we painted the main conference room in the office building I worked in. It was a big conference room with 2 little white boards that always had “Do Not Erase” written on them and a silly little paper easel. So I did some research, and here is what we did.

We bought some Idea Paint (still the best as of this writing, but be warned – the smell will make you want to work somewhere else for about 4 days).


 

Click here to see current pricing. My wife, daughter and I took a Friday after work and that Saturday to paint the conference room walls, which had a nice curved corner.


 

Truth is, my wife (the artist in the family) did most of the work.


 

Once done, it was a giant white board! I couldn’t wait to put up my process flow diagrams on it.


 

I didn’t tell anyone I was doing this, mind you, I just did it. There were a few complaints about the smell into the next Tuesday, but everyone loved it – to the point where people from our company were coming over from another building to have meetings in the room just to use the white board. My (consulting) company did give me a big gift card as a thank you.

A few weeks later it was full.


I guess we will use it in the new weight room to track our workout progress… hmmm. We will see how that works out; I still highly recommend it for the office. Click here to see Idea Paint on Amazon.

Tags:

SQL Server 2005 to 2012 Differences for Developers

by JBrooks 14. November 2015 12:24

We are upgrading from SQL Server 2005 to 2012. A lot of features for developers were added over those 7 years. Here are some you might find useful.


1. You can assign a value to a variable when you declare it.

DECLARE @Status varchar(50) = 'Success', @Today Date = Getdate()


2. You can insert multiple rows without needing to repeat the column list

INSERT INTO MyTable ( ID, LastName )
VALUES ( 100, 'Smith' ),
       ( 200, 'Jones' ),
       ( 300, 'Watson' )

(Up to 1,000 rows per statement.)


3. Among the new data types are DATE and TIME.

No more having to use a DateTime when all that you really want is a date.

DECLARE @Today Date = Getdate()


4. IIF() function

SELECT IIF(1 > 10, 'TRUE', 'FALSE' )

(Result: FALSE)

You can use IIF() with an IN:

DECLARE @LastName Varchar(50) = 'Smith'
SELECT IIF(@LastName IN ('Smith','Jones'), 'Likes SQL', 'Likes ORACLE' )

(Result: Likes SQL)



5. CHOOSE() function

CHOOSE ( index, val_1, val_2 [, val_n ] )

SELECT CHOOSE(3, 'Spring', 'Summer', 'Autumn', 'Winter')
(Result: Autumn)

6. CONCAT() function

CONCAT( string1, string2 [,stringN])

- All arguments are converted to a string first

- All nulls are converted to an empty string

DECLARE @FirstName VARCHAR(20) = 'Bob', @LastName VARCHAR(20), @Age int = 30
SELECT CONCAT(@FirstName, ' ', @LastName, ' is ', @Age)
(Result: Bob  is 30)

7. You have the C-like shortcut operators +=, -=, *=, /= and %=

DECLARE @x INT = 2, @y INT = 2
SET @x += @y
SELECT @x as x, @y as y

(Result: x = 4, y = 2)



8. Paging is now much easier with OFFSET and FETCH NEXT

DECLARE @PageNumber int = 4, @PageSize int = 40

SELECT WNumber, AuditTime, AuditTypeCode, ActivityCode
FROM tb_AuditLog
ORDER BY AuditTime DESC
OFFSET (@PageNumber - 1) * @PageSize ROWS
FETCH NEXT @PageSize ROWS ONLY;

9. FORMAT() function

SELECT FORMAT( GETDATE(), 'dd/MM/yyyy', 'en-US' ) AS 'DateTime Result',
       FORMAT( 123456789, '###-##-####' ) AS 'Custom Number Result';
(Results: the current date as dd/MM/yyyy, and 123-45-6789)


10. TRY_PARSE() and TRY_CONVERT() functions

IF TRY_PARSE('xxx' as DATETIME) IS NOT NULL
    SELECT 'TRY_PARSE: Conversion Successful'
ELSE
    SELECT 'TRY_PARSE: Conversion Unsuccessful'
(Result: TRY_PARSE: Conversion Unsuccessful)


11. You can now create your own types in the database (just as VARCHAR and INT are types). You can even define a table type.

CREATE TYPE IdsTable AS TABLE ( Id INT );

(It could have many more columns.)

You can then pass that type as a parameter to a stored procedure.

CREATE PROCEDURE GetSelectedAccounts
    @MyIds IdsTable READONLY
AS
    SELECT *
    FROM Accounts
    WHERE AccountID IN (SELECT Id FROM @MyIds)
GO

You can also pass the table in as a parameter from C#.

DataTable myIds = new DataTable();
myIds.Columns.Add("Id", typeof(Int32));
myIds.Rows.Add(100);
myIds.Rows.Add(110);
myIds.Rows.Add(2200);
myIds.Rows.Add(40);

dbCommand.Parameters.Add(new SqlParameter("@MyIds", myIds)
    { SqlDbType = SqlDbType.Structured });

DataSet ds = (DataSet) Db.ExecuteDataSet(dbCommand);

12. TRY / CATCH / THROW

BEGIN TRY
    DECLARE @Number int = 5 / 0;
END TRY
BEGIN CATCH
    -- Log the error info, then re-throw it
    INSERT INTO ErrorLog
    VALUES(SYSDATETIME(), ERROR_SEVERITY(), ERROR_MESSAGE());
    THROW;
END CATCH
When you re-throw, it keeps the correct line number of the error.

This can also be used with transactions.

BEGIN TRY
    BEGIN TRANSACTION
        UPDATE #TestingTHROWCommand
        SET OvertimeAmount = 95 / OvertimeAmount
    COMMIT TRANSACTION
END TRY
BEGIN CATCH
    ROLLBACK TRANSACTION;
    THROW
END CATCH

13. New virtual Windows FileTable commands in T-SQL let you manage a folder like a table of documents, while still having external control over the contents

UPDATE C:\Docs\*.* SET ReadOnly = 1 WHERE Author = 'Bob' AND Created < '20100101';

14. Reporting Services no longer requires IIS; it makes direct calls to HTTP.SYS.




15. There are now Oracle-like sequences

CREATE SEQUENCE MySequence START WITH 1 INCREMENT BY 1;

-- a table variable so the example runs end to end
DECLARE @Person TABLE (ID int, FullName varchar(50));

INSERT @Person (ID, FullName)
VALUES (NEXT VALUE FOR MySequence, 'Steve Luna'),
       (NEXT VALUE FOR MySequence, 'John Peter'),
       (NEXT VALUE FOR MySequence, 'Jim Brooks');

Tags:

Development | Silverlight

Auto Feeding the Cat with Video Confirmation

by JBrooks 13. March 2015 11:38

From a schedule program I wrote on my laptop, I wanted to turn on an electric wall outlet, which would then turn on the cat’s auto feeder. The wife said she didn’t trust this to work, so she needed a video showing the cat being fed.

So I combined all the Z-Wave tools I posted about here with some other C# libraries to capture sound and video. I then wrote the code to upload the video automatically to my server and send my wife an email with a link to it.


You can see a sample video here. Here are the steps it now follows:

1. The scheduler is scheduled to feed the cat 3 times during the day.

2. When it is time, the program turns on the laptop’s video and microphone.

3. The program sends a Z-Wave signal to the wall outlet to turn it on.

4. The program waits 30 seconds, then combines the video and sound.

5. The program uploads the new video to my web server.

6. It then sends out an email with a link to a page that will play the video.

I’m going to look into whether I can count the number of times it hears the sound of the food dinging the bowl as it falls. Then I could put the count in the subject of the email, so it would be clear there was a problem if the count was low.

Also, I’m going to use my ChangeAlerts! program to send me a text message if a cat video isn’t uploaded within 5 minutes of its scheduled time. That is a hungry cat that cannot miss a meal!

Tags:

Programming Your Home

by JBrooks 28. February 2015 11:09

I wanted to do some home automation, and I wanted to write the code myself (I’m a software developer). I looked into it and decided to use Z-Wave technology. I started by buying a USB Z-Stick for about $45 to allow my laptop to talk to the different devices. It can control up to 232 devices, and each device rebroadcasts your command, so the network has a long range.


Now I have this hanging off my laptop – not ideal, but eventually I’ll be using a hidden old laptop in a closet to run this.

The next items needed were the Z-Wave devices themselves: you can get motion, light, and temperature sensors, etc. You can also get light switches and outlets that replace your normal ones. I installed the switch below so that a program running on my laptop can turn the front porch light on when it gets dark. The switch always rests in the neutral position, so if someone flips it by hand it springs back to the middle, and it doesn’t move when changed through Z-Wave.


I have 2 of the outlets: one used by a light and one in the kitchen that I’m going to use to control a cat’s auto feeder (done – see here). You can buy all of these items on Amazon.com.

Then I downloaded a library that lets me send commands to the Z-Stick from my programming language (C#); I used a library called OpenZWaveDotNet. I had built a scheduler for a different project, so I reused that code to schedule commands in this project. Then I’ll just have the program run forever…


Everything is working so far. If you can think of another use for this, leave me a comment.

Tags: