AWS Certified Developer – Wrap Up



“I hear you’ve got a saying: ‘Understanding is a three-edged sword.’ Well, we’ve got a saying too: ‘Put your money where your mouth is.’” – John Sheridan

Over the past few months, I’ve been taking prep courses and otherwise studying for the AWS Certified Developer exam.  The goal that I originally set was to have this finished by the end of summer.  While I’m a couple of weeks late, I’m proud to say that I passed!

I want to endorse A Cloud Guru’s coursework and practice exams as crucial to my preparation and practice leading up to the exam.  While certifications are meant essentially to ratify existing knowledge, taking a timed, closed-book exam can be very daunting for those of us who are used to checking our phones when we can’t recall specs or limits off the top of our heads.

So, what’s next?  I’m not really sure, other than to say that AWS and other cloud platform knowledge fits very well into the ServiceNow space with offerings like Discovery and Orchestration.  If there’s continued benefit to gaining knowledge in this space, then I’ll probably dive a little deeper.

Thanks to all for your support and encouragement during this journey.  I invite you to share your own experiences and opinions on AWS certification or certifications in general.  As always, if you have any questions or comments, please feel free to add them here or address them to john@benedettitech.com.

Thanks for looking in!


Integrating Amazon Lex with ServiceNow

In a recent post, I covered creating chatbots with Amazon Lex and built a simple bot for ordering jelly beans.  As promised, I took my bot and integrated it into a custom page in ServiceNow’s Service Portal.  While I’ve worked on several integrations in this space before, this was my first time integrating an AWS service into ServiceNow.

In order to integrate Lex, I needed to provision the following:

  • A ServiceNow instance (Helsinki or later)
  • An AWS Account with a Lex Bot (Created during my previous experiment)
  • An AWS User with the AmazonLexRunBotsOnly policy assigned

With these components in place, it was time to decide on an architecture.  I needed a UI for users to chat with Lex, client-side code to handle the text and transact with Lex, and finally some server-side code to initialize the parameters for the service layer, which are stored in system properties.

For the UI, I created a new Portal, Page and Widget to host all of the above.  The JavaScript SDK for AWS is included via a CDN link.  I would have preferred to keep this piece server side, but the current offerings for the SDK are limited to either the browser or server-side Node.js, which isn’t available in ServiceNow for now.
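
To give a sense of the client side, here is a rough sketch of the kind of widget client controller involved.  The property names on c.data, the message handling and the wiring of credentials are all illustrative; in my app the sensitive values flow through the server script and system properties described below.

    // Illustrative sketch of a Service Portal widget client controller.
    // Assumes the AWS SDK for JavaScript was loaded from the CDN and that
    // the server script exposes region, credentials and bot settings on
    // c.data (all names here are made up).
    function($scope) {
      var c = this;
      c.messages = [];
      // A per-conversation id so Lex can track the session
      var sessionId = 'sp-' + Math.random().toString(36).slice(2);

      c.sendMessage = function(text) {
        c.messages.push({ from: 'me', text: text });

        AWS.config.update({
          region: c.data.region,
          credentials: new AWS.Credentials(c.data.access_key, c.data.secret_key)
        });

        new AWS.LexRuntime().postText({
          botName: c.data.bot_name,
          botAlias: c.data.bot_alias,
          userId: sessionId,
          inputText: text
        }, function(err, resp) {
          $scope.$apply(function() {
            c.messages.push({
              from: 'bot',
              text: err ? 'Something went wrong: ' + err.message : resp.message
            });
          });
        });
      };
    }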

What was critical to keep server side were the application credentials for the service account itself.  There are several properties that must be gathered from your AWS configuration and added as system properties in the ServiceNow platform.
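
As a sketch, the widget’s server script can read those values and hand the client only what it needs; the property names here are made up.

    // Illustrative widget server script: nothing sensitive is hard-coded,
    // everything comes from system properties.
    (function() {
      data.region = gs.getProperty('x_lexchat.aws.region');
      data.access_key = gs.getProperty('x_lexchat.aws.access_key');
      data.secret_key = gs.getProperty('x_lexchat.aws.secret_key');
      data.bot_name = gs.getProperty('x_lexchat.bot.name');
      data.bot_alias = gs.getProperty('x_lexchat.bot.alias');
    })();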

With all this in place, it was time to test!  First by sending a message to Lex to invoke the Intent for ordering a jelly bean:

Then to follow the same flow of the Intent, telling the bot what flavor I want:

And finally confirm my order:

Depending on how you’ve configured your bot, you can either have Lex return the parameters to ServiceNow or pass them forward to an AWS Lambda or another endpoint.  It’s easy to see the potential here for using Lex to drive another entry point for your Service Catalog or even things like Orchestration.
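
For the Lambda route, the fulfillment handler can be quite small.  This isn’t part of my app, just a sketch of a Lex (V1) handler assuming a slot named Flavor:

    // Sketch of a Lex V1 fulfillment Lambda (Node.js).  Lex passes the slot
    // values in the event and expects a dialogAction in the response.
    exports.handler = async (event) => {
      const flavor = event.currentIntent.slots.Flavor; // 'Flavor' is an assumed slot name

      // This is where you could create a catalog request, call an API, etc.
      console.log('Ordering a ' + flavor + ' jelly bean');

      return {
        sessionAttributes: event.sessionAttributes,
        dialogAction: {
          type: 'Close',
          fulfillmentState: 'Fulfilled',
          message: {
            contentType: 'PlainText',
            content: 'Your ' + flavor + ' jelly bean is on its way!'
          }
        }
      };
    };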

If you’d like to take a look at the app, I’ve got it on my GitHub here.  I’m interested in any feedback you have as well as anything to share on ServiceNow app development or AWS Lex.  As always, if you have any questions or comments, please feel free to add them here or address them to john@benedettitech.com.

Thanks for looking in!

AWS Certified Developer – Update

Since my last update, I’m proud to say that I’ve completed my course!  The major components of the second half of this course concerned Storage and Database implementations.  Here’s a brief rundown.

How is Storage handled on AWS?

The main storage offering on AWS is called S3.  This is basically bulk object storage billed based on usage.  If you’d rather use dynamic, shared storage that auto-scales, you’d set up a resource in Elastic File System.  The platform offers several different configurations based on availability and performance expectations.  Versioning and replication of buckets is supported.  Short- and long-term backups are covered by Glacier.  This is long-term storage of snapshots of data at a lower fee, but it binds you to a time commitment.  Those of us who have Disaster Recovery responsibilities can use this to implement Grandfather-Father-Son backup strategies.
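
To give a sense of how simple the developer experience is, writing an object to S3 with the JavaScript SDK only takes a few lines; the bucket and key here are made up.

    // Minimal S3 write with the AWS SDK for JavaScript (v2)
    const AWS = require('aws-sdk');
    const s3 = new AWS.S3({ region: 'us-east-1' });

    async function saveReport(report) {
      await s3.putObject({
        Bucket: 'example-reports-bucket',      // made-up bucket
        Key: 'backups/2017-09/report.json',    // the key acts like a file path
        Body: JSON.stringify(report)
      }).promise();
    }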

How are Databases implemented in AWS?

The conventional implementation of databases is by provisioning virtual DB instances through RDS.  You can choose your preferred engine, like MS-SQL or Oracle, and then select the tier you need within that engine.  For the NoSQL crowd, there’s DynamoDB, which offers low-latency databases for high-traffic services and data analysis tools.  Certification note: the course calls out the exam as being very heavy on DynamoDB.  Calculating provisioned throughput is a prominent item on the cert, as you have a lot of fine-tuning control over read and write capacity.  The key is to find that sweet spot where your capacity is sufficient without over-provisioning.
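
To make the throughput math concrete, here is my own rough sketch of the calculation as the course presents it; double-check the numbers against the official documentation before exam day.

    // 1 read capacity unit  = 1 strongly consistent read/sec of an item up to 4 KB
    //                         (eventually consistent reads cost half)
    // 1 write capacity unit = 1 write/sec of an item up to 1 KB
    function readCapacityUnits(itemSizeKB, readsPerSec, eventuallyConsistent) {
      var units = Math.ceil(itemSizeKB / 4) * readsPerSec;
      return eventuallyConsistent ? Math.ceil(units / 2) : units;
    }

    function writeCapacityUnits(itemSizeKB, writesPerSec) {
      return Math.ceil(itemSizeKB) * writesPerSec;
    }

    // Example: 6 KB items read 10 times per second with strong consistency
    // => ceil(6 / 4) * 10 = 20 RCUs
    console.log(readCapacityUnits(6, 10, false)); // 20
    console.log(writeCapacityUnits(1.5, 5));      // 10 WCUs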

While IAM, EC2 and S3 made up the lion’s share of the course, the remainder consisted of short overviews of additional services such as:

  • Simple Queue Service + Simple Notification Service – Used as a clearinghouse to trigger shared events throughout your environments (see the sketch after this list).
  • Simple Workflow Service – Used for management of back end processing in your API or Service Layer.
  • CloudFormation – A framework for creating templates that provision predetermined, purpose-built sets of AWS resources.
  • Elastic Beanstalk – Basically a wizard for provisioning auto-scaling application hosting environments that are immediately ready to run code.
  • Shared Responsibility Model – An overview of which security and integrity concerns the customer is responsible for and which fall to AWS.
  • Route 53 – More of an actual overview of DNS architecture rather than anything special about its AWS implementation.
  • Virtual Private Cloud (VPC) – This covers configuration of public and private zones of resources and defining rules for interoperation between them. Basically, taking everything we’ve learned and pulling it all together into something useful at the enterprise level. The analogy used for this is to think of a VPC as a logical data center.
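
As a quick illustration of the SQS + SNS item above, publishing an event to a topic with the JavaScript SDK looks roughly like this; the topic ARN is made up, and any queues (or Lambdas, emails, etc.) subscribed to the topic each receive their own copy.

    const AWS = require('aws-sdk');
    const sns = new AWS.SNS({ region: 'us-east-1' });

    // Fan out one event to every subscriber of the topic
    async function announce(eventDetail) {
      await sns.publish({
        TopicArn: 'arn:aws:sns:us-east-1:123456789012:deployment-events', // made up
        Message: JSON.stringify(eventDetail)
      }).promise();
    }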

Now that I’ve completed the course, it’s time to put my money where my mouth is and successfully pass the exam by the end of summer per my original commitment.  Wish me luck!

I’d love to hear any feedback on this post and invite you to share your own experiences and opinions on AWS, either your own projects or learning tracks. As always, if you have any questions or comments, please feel free to add them here or address them to john@benedettitech.com.

Thanks for looking in!

I Want To Look Good Naked


“Anti-wrinkle cream there may be, but anti-fat-bastard cream there is not.” – The Full Monty

I spend quite a bit of time over at The Art Of Manliness.  Some great tips on health and style as well as a wealth of inspirational writing, mostly in the form of fanboying on Theodore Roosevelt.  In looking for my next big challenge, I’m drawing inspiration from this article on Rites of Passage.

Speaking of TR, I remember being interested in him from a very young age.  The main connection I made was that we both had asthma.  Not like I’d ever stack myself up beside TR.  Where he made a point of seeking out adversity and testing his shortcomings, I’d often take pride in taking the shortest and easiest path possible.  Sometimes this is a virtue, but not when it comes to taking care of the self.  As such, I need to settle a long standing debt.

As I’ve mentioned before, I’ve been at an unhealthy weight for my entire adult life (and longer).  Carrying around what is essentially another person has negatively impacted my activities and general quality of life as well as put me on multiple medications.  Rather than keep this as a ‘Someday’ kind of goal, it’s time to put together a plan and get to work.  By doing so, I intend to improve my own health, extend my life and improve my self-image and self-worth dramatically.

Here is my goal:

Success is: I record a weight of 178 or below on 7/27/2018
Failure is: I record a weight above 178 on 7/27/2018

Here’s my plan:

– Use LiveStrong – Record weight daily and keep a food diary
– I will keep my daily calorie intake 500 below my BMR
– I will supplement my calorie deficit by 500 a day with daily exercise
– The data for the above will be maintained publicly here as a widget

I’ve always known what was required but never really applied myself.  But if I follow this plan, I should be able to lose 100 pounds over the next year at a healthy 2 pound a week rate.  This is perfectly reasonable and attainable.

I’m inviting everyone to encourage me.  Even more so, I invite people to hinder me and try and fuck me up.  For my own health and well being, this may very well be the most difficult and the most important thing I’ve ever attempted.  Bring it on!

As always, if you have any questions or comments, please feel free to add them here or address them to john@benedettitech.com.

Thanks for looking in!

Chatbots w/Amazon Lex

Recently, I attended a learning event put on by DevICT, a community for local developers in Wichita, KS.  The topic was Amazon Lex, a platform for building your own chatbots.  These bots can respond to either text or voice from users and do useful things based on the outcomes of those interactions.

Setting up Lex is pretty straightforward.  After logging in to your AWS Console, you can navigate to the Lex product and it brings up your list of bots and controls to create new ones.  Once you’ve created a new bot, you’ll need to define Intents and Slots.  To put this in programmer parlance, if you think of your bot as an API framework or class, you could define Intents as methods and Slots as arguments.

For my bot’s first Intent, I took inspiration from a recent trip to the Jelly Belly jelly bean factory in California.  (The tour is free and great for the short people: i.e. kiddos)  So, I created an Intent called ‘IWantAJellyBean’ and seeded some phrases that would prompt the bot to start asking questions to fill in all the Slots.  These sample phrases are the bridge between plain language and something the API can understand; the platform’s language processing infers the correct Intent based on what you say or type.

Next, for my Slot, I created a short list of flavors.  This essentially acts as a type for fulfillment of the Intent.  When the Intent is invoked, a prompt that you configure in the framework is added to the chat to gather the additional information from the customer.

Once all the information is gathered via the chat, the bot prompts one more time to confirm fulfillment of the order.  The confirmation is defined by the developer and you can determine the outcome based on the customer’s response.  Then, you can choose to trigger an event based on what happens in the chat; you can even leverage an AWS Lambda to take those results and trigger downstream activity.
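
I built all of the above in the Lex console, but roughly the same thing can be done programmatically.  Here is an illustrative, untested sketch using the SDK’s Lex Model Building client; the Intent and Slot names are from my bot, everything else is made up.

    const AWS = require('aws-sdk');
    const lexModels = new AWS.LexModelBuildingService({ region: 'us-east-1' });

    async function defineJellyBeanIntent() {
      // The Slot type: the short list of flavors the bot will accept
      await lexModels.putSlotType({
        name: 'JellyBeanFlavor',
        enumerationValues: [
          { value: 'cherry' },
          { value: 'licorice' },
          { value: 'buttered popcorn' }
        ]
      }).promise();

      // The Intent: sample utterances, the Slot to fill, a confirmation
      // prompt, and fulfillment by returning the values to the caller
      await lexModels.putIntent({
        name: 'IWantAJellyBean',
        sampleUtterances: ['I want a jelly bean', 'Give me a jelly bean'],
        slots: [{
          name: 'Flavor',
          slotType: 'JellyBeanFlavor',
          slotTypeVersion: '$LATEST',
          slotConstraint: 'Required',
          valueElicitationPrompt: {
            messages: [{ contentType: 'PlainText', content: 'What flavor would you like?' }],
            maxAttempts: 2
          }
        }],
        confirmationPrompt: {
          messages: [{ contentType: 'PlainText', content: 'Order a {Flavor} jelly bean?' }],
          maxAttempts: 2
        },
        rejectionStatement: {
          messages: [{ contentType: 'PlainText', content: 'Okay, no jelly bean this time.' }]
        },
        fulfillmentActivity: { type: 'ReturnIntent' }
      }).promise();
    }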

Next comes the fun part: Talking to your bot!  This is mostly a debugging step but I was really impressed by how I was able to just use voice to walk through the test.  It’s very easy to see the potential of this platform to automate interactions with individuals and trigger requests or other business logic in your infrastructure.

I want to thank the individuals at DevICT for a fun and interesting learning experience.  If you happen to be in the Wichita area, I would highly suggest stopping by for their events, which can be found on Meetup or Facebook.

I’m interested in your experiences with Lex and AWS in general.  As always, if you have any questions or comments, please feel free to add them here or address them to john@benedettitech.com.

Thanks for looking in!

Changing Jobs

“… we live in a real world, where the line between prosperity and destitution can be as thin as the bankruptcy of Lehman Brothers or a factory closure.” – In Praise of Lando

A funny thing happened to me on the way to this summer: I left my job of 11 years to pursue a new opportunity.  To be honest, I was content and challenged in my previous role and on a pretty solid path.  But sometimes when an opportunity presents itself, you have to weigh the options and take a chance.  The key question is, are any of us really prepared to honestly explore a new opportunity in good faith?  How many of us are really thinking beyond the here and now?

In principle, we should always be assessing our proficiencies and interests on a regular basis.  All too frequently, we are just trying to make it to the end of our to-do list so we can unplug.  Even worse, sometimes we’re just trying not to lose ground or hoping that an unforeseen crisis doesn’t force our hand.  Working hard and executing might get you through the short term, but it isn’t enough.  Taking a step back to ensure you’re doing the right work is crucial to long term growth and success. This doesn’t just apply to individuals, but to organizations of any scale.

When Seth Godin warned us to “Dig your well before you’re thirsty,” he was challenging our inherent complacency and our tendency to coast and accept what’s handed to us.  Life is replete with black swan moments.  Do you seek them out?  Do you dread them?  I’m starting to think the main question is, what might be holding you back from making any change at all?  It’s one thing to miss or pass on an opportunity.  It’s something else entirely to be so overextended and brittle that the slightest disruption brings everything down.

When I first started out writing BASIC and soldering electronics, the World Wide Web wasn’t even a thing.  Since then, entire technologies have been born and become obsolete. What do you do to keep current and be ready for the next thing?

As always, if you have any questions or comments, please feel free to add them here or address them to john@benedettitech.com.

Thanks for looking in!

When Wisdom Is Your Dump Stat

And now for something completely different….

In a previous post, I’d mentioned that I’ve struggled with health and fitness for most of my adult life. In response to a co-worker’s call to action, my fat self signed up for the 2017 Wichita Gladiator Dash! It’s a 5k obstacle course in the local county park which includes climbing hills, fording lakes and creeks, falling off things into waist deep mud along with other assorted playground activities.

Now that you’ve (hopefully) stopped laughing, I’m happy to say I actually completed the course. And since I’ve never done anything like this before, I’m allowed to claim I achieved a personal best!

One thing that people in sedentary jobs like mine must watch out for is our health.  And it’s a great thing that fitness and good health have seen a resurgence in popularity in recent years.  However, we must contrast that with America’s love affair with bacon, food challenges and ridiculous monstrosities in culinary fare such as the Quadruple Bypass Burger making headlines.  (Where’s THAT race??)

In any event, I’m proud to have made the choice to try something challenging in the name of my own good health and the entertainment of others (like my wife).  I’d love to hear any of your stories about your own health and fitness goals.

As always, if you have any questions or comments, please feel free to add them here or address them to john@benedettitech.com.

Thanks for looking in!

Knowledge 17 – A Look Back

What is the purpose of a conference?  

To some it’s an opportunity to get out of the daily grind for a few days and see some cool new tools and make some new connections.  Ultimately, it’s an opportunity to ensure that you’re heading in the right direction and to come back armed with knowledge to guide conversations about existing work streams and inspire new ones.

Knowledge is an annual conference put together by ServiceNow intended for any individuals that use or build on the platform.  My involvement this year started with an invitation from a colleague to help with a presentation he was working on for this year’s conference.  After a live rehearsal at our local SNow User Group, I was very excited to attend and both learn from and share with others in the field.

Sunday, May 7th – Monday, May 8th

My conference journey began with a 2-day pre-conference seminar on Business Application Development.  While not highly technical, the course provided valuable information to guide decision making and recommendations for applications on the platform.  Additionally, they provided strategies for identifying and solving pain points or broken windows on the platform in general.

Other guidelines for improving our approaches to development touched on defining measures, such as return on investment or solving common business challenges by hiding or streamlining complexity.  Additionally, we were reminded about how important it is to establish and drive the narrative of solutions built on the platform.  Essentially, to tell our customer base what we’re doing and why and to solicit feedback and adapt our approaches accordingly.

On the development side, we spent some time on effective user story writing as well as best practices around extending the existing core modules and organizing any customizations or new features we implement.  We were introduced to some new features for upcoming releases, such as UI/UX functionality, improvements to automated testing, and the resurrection of a previously deprecated debugger, which will be an extremely useful tool for developers.

For architecture, we went over some basic tasks like table creation and decision making around when to build new and when to extend existing tables.  Also,  we were coached on how important it is to define the scope of a new feature before beginning work.  Applications should have a clear purpose that can be reconciled to the measures of business value and customer feedback covered previously.  You might enjoy building something cool in a new way, but that’s never a guarantee that it’ll see any adoption beyond curiosity or superficial interest.

Tuesday, May 9th

Tuesday marked the official beginning of the conference and the ServiceNow CEO’s Keynote did not disappoint.  A record setting 15,000+ attendees were challenged to improve the customer user experience, protect the value that we’ve created and continually work to reclaim wasted time and resources that can be better utilized elsewhere.  Members of the ServiceNow community were informed of additional efforts to continue to address the gender gap and reminded to challenge preconceptions about career paths for anyone and everyone.

My first class of the day was Angular2 applications for the ServiceNow Platform.  While I’ve been working in Angular for a couple of years now, I had zero experience with Angular2 or deploying a ServiceNow application from a GitHub repo.  This session gave us a walkthrough on staging and testing an Angular app locally using NodeJS, publishing to GitHub and then installing the app directly from a hyperlink on GitHub itself.  This spawned quite a few ideas for me around better organization of code and sharing that code with others.

Next, I attended a course on Testing Inbound REST APIs.  This is a possible feature for Jakarta that will allow developers to use ServiceNow’s Automated Testing Framework to simulate HTTP calls against tables and services they build and establish expectations around functionality and behavior on those calls.

My labs were done for the day, so I attended a business-oriented breakout called Enabling Enterprise Architecture Decisions Through the ServiceNow Platform.  The session covered ideas and justifications for consolidating existing services and data into the platform to eliminate wasteful and repetitive practices throughout the enterprise.  The idea is that, by removing many of the seams between various layers of stand-alone solutions, ServiceNow simplifies the conversation around enterprise architecture by assembling it into a unified platform.  Additionally, the platform can allow stakeholders to focus on managing and prioritizing services rather than keeping track of nodes and their dependencies separately.

To cap off the day, we had our own presentation on Service Portal. Our topic specifically covered challenges and lessons learned when integrating 3rd party platforms into the Portal itself and providing a seamless and positive user experience.  This was my first time actually presenting at a global conference so it was a bit nerve-wracking.  But, our presentation was well received with excellent Q&A from our audience.  Thanks to all who attended!

Wednesday, May 10th

I started off the day with an interesting session on certifying applications for the ServiceNow Store.  While I haven’t personally built any public applications, the standards SNow establishes for their store can easily inform standards for internal applications as well.  They covered a Top 10 list of common mistakes made when developing, mostly around roles and security.  Additionally, we were reminded of their built in module for certifying applications, which can be useful for spot checking applications or other features in progress or already in the wild.

Next was a breakout entitled Defining your App Development Methodology for ServiceNow.  This was basically an outline of steps to follow when proposing or accepting new work: questions around demand, identifying key stakeholders and sponsors, and maintaining their interest throughout the process.  Also, there was a reminder that new features always include a subsequent cost of support and maintenance throughout the life of that feature.  One last thing: ‘Have a Testing Zealot!’  Not my term, but I’m using it anyway.

My first lab of the day was on Advanced Service Portal Widget Techniques.  We covered several implementations of a list view in the portal incorporating conventional server-side GlideRecord calls and client-side API calls.  Combined with configuration level constants like table name, we were given a template for a reusable widget that can be easily cloned and tweaked for multiple uses.
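
For anyone curious about the pattern, here is my own minimal sketch (not the lab’s actual code) of a server script that takes the table name from a widget option so the same widget can be reused across pages.

    // 'table_name' is an illustrative widget option
    (function() {
      var table = options.table_name || 'incident';
      data.records = [];

      var gr = new GlideRecord(table);
      gr.addActiveQuery();
      gr.setLimit(10);
      gr.query();
      while (gr.next()) {
        data.records.push({
          sys_id: gr.getUniqueValue(),
          display: gr.getDisplayValue()
        });
      }
    })();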

Lastly, I attended a session on Analytics and Machine Learning.  This is an equally arcane and fascinating topic for me, and it will be interesting to see how the application of AI informs the ServiceNow platform going forward.  We were introduced to automated Virtual Agents that can act as first responders to customers and learn from previous customer engagements to guide and improve future sessions.  Additionally, machine learning can be applied to data within our existing systems to derive additional meaning from data points that human analysts might miss.

Thursday, May 11th

The last day of the conference definitely finished on a high note with the CreatorCon Keynote.  Developers and other creators were reminded that expectations are constantly changing and growing.  The sheer volume of data and the speed at which the meaning of that data needs to be communicated will only continue to grow.  We should not only be thinking of automation for customers, but also every edge we can gain in our own processes to increase our velocity without compromising effectiveness and overall quality.  Tools such as native automated testing and integrated debugging can only help.

Our last lab and session of the conference was Managing Team Workload and Collaboration with VTB and Connect.  Most of the information on Virtual Task Boards was already familiar territory for me but it was neat to see some new improvements and features, including Connect integration for live collaboration among team members.

Summary

Needless to say, this past week has been a whirlwind of people and concepts and generally drinking from a fire hose of information with maybe a small respite now and then to let those ideas breathe.  What’s most rewarding for me is to see so many people in one place excited and passionate about what they do.  Our work has a soul.  It’s necessary to recognize the value in what we do and continually strive to improve.  Conferences are good for a spot check (sanity check?) on where we’re at and where we’re headed.  I highly recommend them.

I’d love to hear any feedback on this post, especially if you attended Knowledge 17 and would like to discuss your own experiences.  As always, if you have any questions or comments, please feel free to add them here or address them to john@benedettitech.com.

Thanks for looking in!

AWS Certified Developer – Progress So Far

In my post about certifications, I mentioned starting a certification course on Amazon Web Services development.  Bit by bit, I’ve been making progress through the course itself.  The structure is pretty straightforward.  It begins with an overview of what AWS is overall and what you should expect to get out of the course.  The primary idea is to prep you for the basic Associate certification, so they’re going for breadth as opposed to depth.  That’s fine with me!

First things first – Identity and Access Management

Some of the first things to consider when building a new app are who your audience is and what they need to be doing.  Trying to shoehorn in a security or roles model after you’re under way is just asking for trouble.  Therefore, the course starts us off at the beginning by showing us how to build roles and use them programmatically through the CLI.
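
The course drives this through the CLI; as a rough equivalent, here is what the same steps look like with the JavaScript SDK’s IAM client.  The user name and policy are made up.

    const AWS = require('aws-sdk');
    const iam = new AWS.IAM();

    async function createDeployUser() {
      await iam.createUser({ UserName: 'deploy-bot' }).promise();

      // Grant only what the user needs -- here, read-only access to S3
      await iam.attachUserPolicy({
        UserName: 'deploy-bot',
        PolicyArn: 'arn:aws:iam::aws:policy/AmazonS3ReadOnlyAccess'
      }).promise();

      // Programmatic credentials for use with the CLI or an SDK
      const keys = await iam.createAccessKey({ UserName: 'deploy-bot' }).promise();
      return keys.AccessKey; // contains AccessKeyId and SecretAccessKey
    }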

One of my more interesting takeaways from this was locking down your root access using Multi-Factor Authentication.  This involves creating a key object on AWS and mapping it to an authenticator app on your phone.  The premise here is that no one should be able to get root in your environment based on a simple password.  It’s a good habit to get into before you entrust important data or business logic to the cloud.

What is EC2?

Once you have your roles in place, it’s time to provision resources.  EC2, or Elastic Compute Cloud, is where you can provision various types of virtual compute hosts.  There are options based on conventional questions like the number of processors and amount of memory, or you can request role-based hosts optimized for graphics, high memory or transaction-intensive needs.
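
Provisioning a host programmatically is a single call; this sketch assumes a placeholder AMI ID and the smallest general-purpose instance type.

    const AWS = require('aws-sdk');
    const ec2 = new AWS.EC2({ region: 'us-east-1' });

    async function launchHost() {
      const result = await ec2.runInstances({
        ImageId: 'ami-12345678',   // placeholder AMI
        InstanceType: 't2.micro',  // smallest general-purpose type
        MinCount: 1,
        MaxCount: 1
      }).promise();
      return result.Instances[0].InstanceId;
    }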

This is one of the longest sections of the course, but an important term that you might hear a lot about is something called Lambda.

What is Lambda?

Lambda is Amazon’s event-driven code solution, where you have a function or service waiting on a call and responding only as needed.  So, you can think of it as a sort of headless API where all you have to worry about is the function or service itself, with none of the typical PaaS concerns such as host or middleware setup or general availability.
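
To show how little there is to manage, a complete Lambda “deployment” can be nothing more than a handler like this (Node.js runtime; the event shape is whatever your trigger sends).

    // AWS invokes this function on demand; there is no host to provision
    exports.handler = async (event) => {
      const name = event.name || 'world';
      return { message: 'Hello, ' + name + '!' };
    };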

One example of Lambda pointed out by the course is Amazon’s Alexa service which is available on their home devices, such as the Echo.  I’ve only tinkered with Lambda once or twice, but the potential is very exciting.  I’m looking forward to a deeper dive at a later date.

At this point, I’m only about a third of the way through but will be spending more time on it during May and June with a goal of taking my certification over the summer.  For those who are interested, the course is put together by a company called A Cloud Guru.  They have several other courses and tracks for AWS available at their website and at Udemy, where I’m taking my own course.

I’d love to hear any feedback on this post and invite you to share your own experiences and opinions on AWS, either your own projects or learning tracks.  As always, if you have any questions or comments, please feel free to add them here or address them to john@benedettitech.com.

Thanks for looking in!

Azure Follow Up – Closing It Down

It’s been a little while since I wrote about Azure.  Early on, I wasn’t sure how much I’d be focusing on using it outside of my day-to-day, and the truth is, I haven’t touched it much since last year.  Nothing against the service; I just haven’t had a need for dedicated hosting in a while.
 
On that note, I will be shutting down my Azure resources as of today.  I’m in no rush to port what I’ve built so far to AWS, but my projects are on GitHub for any who are curious.  I may revisit them at a later date.
 
Even though I won’t be actively working in the Azure space for the foreseeable future, I’m still open to discussion and collaboration on the platform.  As always, if you have any questions or comments, please feel free to add them here or address them to john@benedettitech.com.

Thanks for looking in!