Jonathan's Blog

Mindful Leadership and Technology



Cloud Pricing - A DevOps Feature Suggestion

Posted on .

I wrote about this in an earlier post.

Google will begin charging for in-use, public IP addresses on January 1, 2020. Is this the much ballyhooed supply crunch for IPv4 addresses in its early stages? Probably not.

My guess is that Google simply sees this as a chance to recognize some revenue from a product/service that it has previously been giving away. And no one is going to switch cloud providers over a few bucks charged for an in-use IP address.

So, why write a blog about it?

Because it is December 18th and they still haven't published a SKU for it, which makes it hard for me to update my IP Address pricing tool on pricekite.io. This is mildly frustrating since I have some personal time this week in advance of the holidays and I was hoping to make this change.

For now, my information on pricekite remains accurate, in that I published the prices when Google published them. However, I would like the dynamic piece of this to work correctly. Right now it is simply hard-coded in the UI - not the most elegant solution.

In any case, this speaks to the generally opaque way that pricing and product SKUs work in the cloud world. SKUs are published, but the exact way in which they work, interact, and show up on your bill is sometimes hard to understand. Which SKUs apply? How are the discounts accounted for? Is it one SKU or many? How exactly will I be charged for what I am creating?

Each cloud provider has its own approach to this, and Google actually has a decent tool here. But it still isn't perfect, and it isn't always clear how the charges will work.

It would be great if the cloud providers would create a cloud pricing tool (similar to the way the Lambda permissions tool works) that showed you exactly which SKUs you were using, how discounts are likely to apply, and a forecasted cost with the ability to tweak scenarios.

Here is how the Lambda resource visualization works:

[Image: AWS Lambda Designer resource visualization]

To go a step further, I'd like to see suggestions (based on DB structure or code being written) for how to reduce cost or make smarter choices. I believe this would be achievable, although it might be difficult to do.

This visualization would show all the SKUs that would be used based on the current configuration and code of a particular service, and it would have some inputs to allow you to tweak scenarios and understand the cost of your choices.
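
To make this concrete, here is a rough, purely hypothetical sketch of what such a per-SKU forecast could return. The SKU names, quantities, prices, and discounts below are invented for illustration only; no provider publishes this structure today.

```javascript
// Hypothetical per-SKU forecast for one configured service.
// Every SKU ID, price, and discount here is made up for illustration.
const forecast = {
  service: "my-api-function",
  lineItems: [
    { sku: "EXAMPLE-REQUESTS",   quantity: 12960000, unit: "requests",   unitPrice: 0.0000002, discount: 0.10 },
    { sku: "EXAMPLE-GB-SECONDS", quantity: 1296000,  unit: "GB-seconds", unitPrice: 0.0000167, discount: 0.00 },
  ],
};

// Total monthly forecast = sum of quantity * unitPrice * (1 - discount) per SKU.
const total = forecast.lineItems.reduce(
  (sum, item) => sum + item.quantity * item.unitPrice * (1 - item.discount),
  0
);

console.log(`Forecast for ${forecast.service}: $${total.toFixed(2)}/month`);
```

Tweaking a scenario would then just mean changing the quantities and re-summing.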

This would allow developers to truly embody the DevOps mindset we are all striving for. How does my code choice impact the cost of this solution? As things stand today, no cloud provider offers this type of functionality.


Random Test Harness

Posted on .

I created a basic random testing harness and put the code on GitHub:

https://github.com/jonathan-fries/random_testing

It is very, very basic as the ReadMe file points out. I will most likely do some additional work on it.

As it stands, there is no obvious problem with either of the sources of randomness (or pseudo-randomness) tested.

Both Math.random() and Random.org work fine. That is, when you ask for a bunch of random numbers in a range, they return a result that looks pretty darn random.
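
For context, the core of the harness amounts to something like the sketch below: pull a batch of numbers in a range from a given source and count how often any value repeats. This is a simplified stand-in for the actual repo code; the range of 1200 and the batch size of 20 are just the values from my manual testing.

```javascript
// Minimal sketch of the duplicate check; not the actual repo code.
// source() should return an integer in the range being tested.
function countRepeats(source, draws) {
  const counts = new Map();
  for (let i = 0; i < draws; i++) {
    const n = source();
    counts.set(n, (counts.get(n) || 0) + 1);
  }
  const repeated = [...counts.values()].filter((c) => c > 1).length;
  const maxFrequency = Math.max(...counts.values());
  return { repeated, maxFrequency };
}

// Math.random() as one source of (pseudo-)randomness.
const mathRandomSource = () => Math.floor(Math.random() * 1200);

console.log(countRepeats(mathRandomSource, 20));
```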

This does not explain behavior that I have seen from Math.random().

The next three things that I plan to do are:

  1. Introduce a static, non-variable delay (already in progress).
  2. Develop a basic UI for displaying the results.
  3. Add Crypto.getRandomValues() as another source of randomness.

My reasoning for number 1 is that, so far, it is clear with this test harness that when you use it in a basic way, there is no pattern.

So the experiment would seem to dispel my hypothesis.

It is possible that I was simply seeing patterns where there are no patterns. The human brain is good at that.

But I really did see the same number (selected out of about 1200) come up 3 to 4 times within a set of 20 numbers. And this happened a lot.

So, perhaps it has to do with the way I was calling it. And if I was creating a pattern through manual testing, perhaps a standard delay could reproduce a similar situation. So, I will introduce a delay and see if that produces different results.
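
The delayed version will probably look something like this sketch: the same kind of draw loop, but with a fixed pause between calls. The 500 ms delay is an arbitrary placeholder, not a claim about what the real harness will use.

```javascript
// Sketch of drawing numbers with a fixed, non-variable delay between calls.
// The 500 ms value is an arbitrary placeholder.
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

async function drawWithDelay(source, draws, delayMs) {
  const results = [];
  for (let i = 0; i < draws; i++) {
    results.push(source());
    await sleep(delayMs);
  }
  return results;
}

drawWithDelay(() => Math.floor(Math.random() * 1200), 20, 500)
  .then((numbers) => console.log(numbers));
```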

If nothing else, I can simply lay to rest my (perhaps human-induced) pattern recognition and sleep easier knowing that pseudo-random is really pretty random after all.

I don't really expect to see any pattern with Random.org. They were just a control and I had recently written code to integrate with them, so I included them for comparison.


Struggles with Math.random()

Posted on .

I've been using Math.random() in JavaScript (both in the browser and on the server) to generate random numbers for a couple of years and, for reasons I can't explain, I get a lot of duplicates.

Yes, I know, it's random. So you will get duplicates. But I was getting duplicates on the order of every 10th number, when selecting from a range of over 1000 possibilities. And this happened every time I looked at a sequence that long.

I Googled it exhaustively and found nothing meaningful.

It is true that Math.random() is not really random, of course. It is pseudo-random: the computers you and I have on our desks, or the ones you can easily get provisioned in cloud data centers, cannot generate truly random numbers.

But when you look at the articles on these pseudo-random number generators (PRNGs), you see that duplicates should not happen as often as they were happening to me.
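
As a rough sanity check on "how often should it happen," the sketch below estimates how often a fresh draw repeats one of the last few draws. The window of 10 previous draws and the range of 1000 are just stand-ins for the numbers described above; for a window much smaller than the range, the expected rate is roughly window/range, i.e. about 1%, nowhere near every 10th number.

```javascript
// Rough sanity check: how often should a new draw from `range` values
// repeat one of the previous `windowSize` draws? For windowSize << range
// the expected rate is roughly windowSize / range (about 1% for 10 / 1000).
function duplicateRate(draws, range, windowSize) {
  const recent = [];
  let duplicates = 0;
  for (let i = 0; i < draws; i++) {
    const n = Math.floor(Math.random() * range);
    if (recent.includes(n)) duplicates++;
    recent.push(n);
    if (recent.length > windowSize) recent.shift();
  }
  return duplicates / draws;
}

console.log(duplicateRate(100000, 1000, 10)); // expect a value near 0.01
```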

Here are some useful articles on how they work:

https://hackernoon.com/how-does-javascripts-math-random-generate-random-numbers-ef0de6a20131

https://v8.dev/blog/math-random

They're a little techy, but so is this topic.

Today browsers offer Crypto.getRandomValues(), which is a good deal more secure, but I simply decided that I wasn't leaving this to pseudo-random chance. I went looking for places where I could get real random numbers on the internet, and I found two.
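
For reference, drawing an integer in a range from crypto.getRandomValues() looks something like the sketch below; the rejection loop avoids the slight bias a plain modulo would introduce.

```javascript
// Sketch: uniform integer in [0, range) using the browser's crypto API.
// Rejection sampling avoids the modulo bias of (value % range).
function cryptoRandomInt(range) {
  const maxUint32 = 0x100000000;                  // 2^32
  const limit = maxUint32 - (maxUint32 % range);  // largest unbiased cutoff
  const buf = new Uint32Array(1);
  let value;
  do {
    crypto.getRandomValues(buf);
    value = buf[0];
  } while (value >= limit);
  return value % range;
}

console.log(cryptoRandomInt(1200)); // an integer between 0 and 1199
```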

The first I already knew about, random.org. It has been around for a good long while, and it has an API.

The second I found was a consortium, seemingly led by Cloudflare. I know Cloudflare as I use them on another site for DNS security, but I was unaware of their work around cryptography.

Here is a link to some information about Cloudflare's project 'League of Entropy' and also another cool project they have for generating their own (internal) random numbers from lava lamps:

League of Entropy
Lava Lamps

Using Cloudflare's solution was more complicated than what I wanted and I was not sure exactly how to adapt it for my needs, so I went with Random.org instead.

Random.org's API was very easy to use and their testing tool was very helpful. Here is a link to their request builder:

https://api.random.org/json-rpc/2/request-builder
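
For anyone curious, a request to their JSON-RPC API looks roughly like the sketch below. I believe generateIntegers and the /json-rpc/2/invoke endpoint match the v2 API the request builder covers, but treat the details as approximate and check their documentation; the API key is obviously a placeholder.

```javascript
// Rough sketch of a Random.org JSON-RPC v2 request for 20 integers in [1, 1200].
// "YOUR_API_KEY" is a placeholder; see random.org's docs for exact parameters.
async function fetchTrueRandomIntegers(n, min, max) {
  const response = await fetch("https://api.random.org/json-rpc/2/invoke", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      jsonrpc: "2.0",
      method: "generateIntegers",
      params: { apiKey: "YOUR_API_KEY", n, min, max, replacement: true },
      id: 1,
    }),
  });
  const payload = await response.json();
  return payload.result.random.data; // array of n integers
}

fetchTrueRandomIntegers(20, 1, 1200).then((nums) => console.log(nums));
```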

Random.org uses atmospheric radio noise to generate real random numbers. That is, they have devices (in multiple countries) tuned between commercial radio frequencies, and the background noise from these radios is used to generate a random signal that produces random numbers. Neat! Here is a link to a FAQ entry about it that is very interesting:

https://www.random.org/faq/#Q1.4

The only thing that I did not like about it is that I needed to round-trip to random.org to get a random number, and I had to figure out how to secure the license key. I did not want a lot of latency, and I did not want to publish my license key to every browser in the world. Here is what my solution looked like (a rough sketch of the server side follows the list):

  1. In browser, generate a pseudo-random number on page load. This is a fallback if other things break.
  2. In browser, make a background request to the website back-end for a real random number.
  3. On the server, if I have a real random number available, send it to the browser.
  4. On the server, if I don't have a real random number available, make a request for a buffer of random numbers from random.org. Generate a pseudo-random number and send it to the browser to avoid any additional latency.
  5. On the server, once I receive a buffer of random numbers, store it.
  6. In the browser, make use of the random number.
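
Here is a minimal sketch of what the server side of steps 3 through 5 might look like. The function names, buffer size, and Express-style handler are illustrative, not the actual site code; fetchTrueRandomIntegers() is the Random.org call sketched earlier.

```javascript
// Illustrative sketch of the server-side buffer (not the actual site code).
const buffer = [];            // in-memory stash of true random numbers
const BUFFER_SIZE = 100;      // arbitrary placeholder size
let refillInFlight = false;

async function refillBuffer() {
  if (refillInFlight) return; // avoid duplicate requests to random.org
  refillInFlight = true;
  try {
    const numbers = await fetchTrueRandomIntegers(BUFFER_SIZE, 1, 1200);
    buffer.push(...numbers);
  } finally {
    refillInFlight = false;
  }
}

// Express-style handler: serve a true random number if one is buffered,
// otherwise fall back to pseudo-random and refill in the background.
function handleRandomRequest(req, res) {
  if (buffer.length > 0) {
    res.json({ value: buffer.pop(), source: "random.org" });
  } else {
    refillBuffer(); // fire and forget, so the response adds no latency
    res.json({ value: Math.floor(Math.random() * 1200) + 1, source: "pseudo" });
  }
}
```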

I was a lot more worried about introducing latency (and failures) than I was about whether or not I had a truly random number every single time, since this isn't exactly a high-security system.

Nonetheless, I wanted a random number, so I implemented the approach above to give myself true randomness most of the time, with a fallback to pseudo-randomness to avoid excessive latency or failures. It took about 3 hours to get it all working and tested for edge cases such as "What happens if the web service fails?" or "What happens if my buffer counter for random numbers ends up in the negative for some reason?"

I still want to get to the bottom of why I was seeing so many duplicate numbers from Math.random(). I have to believe it was something to do with my code and not with Math.random() itself.

I am going to branch that effort into a different project to see if I can reproduce the behavior and get to the bottom of it.


Now Reading: The Happiness Advantage

Posted on .

I'm currently re-reading The Happiness Advantage by Shawn Achor.

It's a great book for thinking about happiness first, not as an outcome that we get later on after we're successful or rich or whatever.

In fact, according to the book, you stand a better chance of reaching those goals if you focus on being happy first.

Happiness is a precursor to success, not the other way around.

The author has identified 7 principles that he believes will help you to be happier. Here is a quick summary:

  1. The Happiness Advantage - As mentioned above, happiness drives success, and not the other way around. But why is that true? It's true because happy people have a greater ability to think creatively and process options. This is an evolutionary adaptation - where stress narrows us down to fight, flight, or freeze, happiness opens us to the great variety of options that are available. In addition to discussing the theoretical underpinnings, this chapter offers a number of ways to give yourself the happiness advantage. These are tips and tricks to help lift your mood and make you more open to possibilities and success, as well as providing an antidote to stress. There are also tips for leaders on how to infuse your workplace with happiness.
  2. The Fulcrum and the Lever - By changing how we view the world, we change how we react to it. By having a more positive outlook, we will react with greater positivity, freeing us from negative reactions. By doing this we can have a much greater impact on the world around us. At first blush, this feels like a restatement or explanation of number 1, but it is ultimately more than that. While tips and tricks to add happiness help, and being happier can help us be more successful, point 2 is deeper. This is about a positive mindset as a fundamental alteration of ourselves, creating even greater possibilities. The author says it best:

Simply put, by changing the fulcrum of our mindset and lengthening the lever of possibility, we change what is possible. It's not the weight of the world that determines what we can accomplish. It is our fulcrum and our lever.

  3. The Tetris Effect - The more time that we spend engaged in an activity, the more our brain becomes wired to perform that activity. Tax accountants spend their days searching for errors in tax forms. As a result, they become wired to search for errors in everything they do. Conversely, the more time we spend scanning for the positive, the more we gain access to three very important tools: happiness, gratitude, and optimism. Happiness we've already discussed. Gratitude is a reproducer of happiness in the now: the more we see things to be grateful for, the more grateful we become, the more we see things to be grateful for, and so on. Optimism does the same for the future: the more we focus on happiness, the more we expect that trend to continue in the future.
  4. Falling Up - When we fail or suffer setbacks, falling down or staying in place are not the only options. For many people, failures and setbacks (and even trauma) can produce changes that allow you to not simply stay where you are, but take even bigger steps forward. This is the idea of falling up. Become better because of the setbacks in your life. Use your failures and losses as learning experiences. See where those silver linings can take you. The author provides several techniques for how to think about these situations and gather positive momentum from them.
  5. The Zorro Circle - Limiting yourself to small, narrow goals helps you stay in control and expand your ability to stay focused, rather than giving in to negative thoughts and helplessness. You may be familiar with this (if you have ever read The Seven Habits of Highly Effective People by Stephen Covey) as the Circle of Influence and Circle of Concern. It was a great principle then, and it still works now. This book provides some interesting additional scientific information about how this works in our brains and bodies, and it provides insightful tactics to build up our circles of influence (or control) to make us more resilient.
  6. The 20 Second Rule - By reducing small barriers to change, we make it easier to develop new, healthy habits. It can be difficult to make changes to improve our health or well-being. Willpower alone is quite fallible and gets worn down the more we are asked to use it. By the end of a long day it can be difficult to work up the necessary grit to go to the gym. We can make this easier on ourselves by simply removing the small obstacles that unnecessarily sap our self-control energy - remove bookmarks to distracting sites, keep your boots at the ready so you still go outside in the winter, put that book you're meaning to read on the coffee table where it is easy to get to, or hide the remote control so that it is hard to turn on the TV.
  7. Social Investment - Building and maintaining social connections has important ramifications for our ability to handle stress and face challenging situations. When we have strong social connections, we are more resilient and less likely to think of situations as stressful in the first place. Even brief encounters can be beneficial - a short encounter can still be high quality, resetting our respiratory system and reducing levels of cortisol (a stress hormone) in our bodies. Rewards for positive social interactions are very much wired into our brains. So here is one more reason (if you needed one) to maintain your social connections and build up your work and personal networks.

The last part of the book focuses on the ways that using the 7 principles can help us to spread the benefits at work, at home, and everywhere else in the world.

In addition to the overview information I've shared, the book has lots of detail on how things work as well as how to put it into action.

I'm personally a lot more interested in putting things into action, and I don't usually have to be sold on the fact that it works, but the evidence is there if you need it.

The individual steps and processes are easily worth the price of the book. After all, if even one of these makes a difference for you - whether in your personal happiness or your career satisfaction - what was that worth?


How Much are you Charging me Right Now?

Posted on .

Whenever I go and sign up for any particular cloud provider, or add a new service, I always have a moment of panic that sounds like this in my head:

"How much will this cost? How much is this already costing me, RIGHT NOW at this VERY MINUTE?"

Because the answer, if you aren't careful, could be a lot of money.

So, I am improving my skills with all of the cost dashboards, and I have even provisioned an AWS Organization in such a way as to track the cost of my AWS DeepRacer contest participants (not as simple as I thought it would be).

But it is difficult to understand what is turned on at any given second and what is being charged for.

It would be nice to have a Panic Report that would show you everything that was operational and let you see what the per hour or per second charge for that thing was.

Perhaps this runs counter to the notion of "don't make it hard for the customer to give you money" or "encourage them to sign up for everything and don't show the cost until the bill arrives", but I have to believe that these companies are not that shortsighted.


Side note: I am signing up for IBM Cloud and Oracle Cloud, in the interest of seeing if I can pull the same information into pricekite from those providers that I did for Google Cloud, Microsoft Azure, and AWS.

So far, I think the answer is 'No', but we'll see.

With Oracle, I am unable to even sign in at this juncture. Their sign-in process is very much 'not like the others' and seems hyper-focused on the enterprise.

I guess that makes sense for them, but it is frustrating. I guess I won't accidentally be giving them any money.


Pricekite.io Beta

Posted on .

The new version of Pricekite is up, which I am labeling 'beta'. It allows for interactive comparison of serverless compute pricing across Google Cloud, Amazon Web Services, and Microsoft Azure. It is located here. And the code is also on GitHub.

I'm pretty happy with the current results. You can repeat the results of the earlier blog post exactly, meaning that my math was right. Though the fact that the Azure SKUs count things in 10s instead of 1s threw me for a loop momentarily.

Here are the core research tools that were used to do this, along with the cloud providers' private APIs:

The Google SKU explorer is very helpful; I wish all the providers would add such a feature.

You can also complete the extended analysis I mentioned in my blog notes. To run those scenarios, you increase the number of functions, transactions, or any other parameter and see how the effect of discounts dissipates with larger volumes.

My original scenario was:

2 Functions
12,960,000 Executions/Month
512 MB Function Memory
200 ms/execution

In this scenario (which is substantial) AWS is the clear winner because of their discounts. However, when you increase the number of functions to 32, Azure becomes the less expensive option because of their slightly lower base pricing.
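
As a worked example of the arithmetic behind these comparisons, here is a sketch of the kind of calculation Pricekite does for a single provider. The prices and free-tier allowances below are illustrative placeholders, not any provider's actual published rates; plug in real SKU prices to reproduce the numbers above.

```javascript
// Sketch of a monthly serverless cost calculation for the scenario above.
// All rates and free-tier allowances here are placeholders for illustration.
function monthlyServerlessCost(scenario, pricing) {
  // 12,960,000 executions * 0.2 s * 0.5 GB = 1,296,000 GB-seconds before the free tier.
  const gbSeconds =
    scenario.executions * (scenario.msPerExecution / 1000) * (scenario.memoryMB / 1024);

  const billableRequests = Math.max(0, scenario.executions - pricing.freeRequests);
  const billableGbSeconds = Math.max(0, gbSeconds - pricing.freeGbSeconds);

  return (
    billableRequests * pricing.pricePerRequest +
    billableGbSeconds * pricing.pricePerGbSecond
  );
}

const scenario = { executions: 12960000, memoryMB: 512, msPerExecution: 200 };

const illustrativePricing = {
  pricePerRequest: 0.0000002,  // placeholder rate per execution
  pricePerGbSecond: 0.0000167, // placeholder rate per GB-second
  freeRequests: 1000000,       // placeholder monthly free allowance
  freeGbSeconds: 400000,       // placeholder monthly free allowance
};

console.log(monthlyServerlessCost(scenario, illustrativePricing).toFixed(2));
```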