
Our Blog.

Humans writing words that span topics from support to design to content management and beyond!

 

Audio Compression (Part 2)

The way audio is stored on a CD (in a .wav file) is uncompressed. It uses a constant bitrate and always takes the same amount of storage space regardless of what audio is actually being stored. This is why writable CDs always state the number of minutes of audio you can store on them (80 minutes for a typical 700 MB disc). It also means that a 3-minute song and a file containing 3 minutes of pure silence take up exactly the same amount of space (about 30 megabytes).
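For a back-of-the-envelope check, here's that math as a quick Python sketch (assuming standard CD audio: 44,100 samples per second, 16 bits per sample, two channels):

# Rough size of uncompressed CD audio (standard Red Book values assumed)
SAMPLE_RATE = 44_100       # samples per second
BITS_PER_SAMPLE = 16       # bit depth
CHANNELS = 2               # stereo

bytes_per_second = SAMPLE_RATE * (BITS_PER_SAMPLE // 8) * CHANNELS
song_bytes = bytes_per_second * 3 * 60                           # a 3-minute song

print(f"{bytes_per_second:,} bytes of audio per second")         # 176,400
print(f"3-minute song: ~{song_bytes / (1024 * 1024):.0f} MB")    # ~30 MB, silence or not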

 

Now let’s go back in time a bit, to the turn of the century, when ‘high-speed’ Internet wasn’t something everyone had, and even if you did it was well under 1 Mb/s (keep in mind that’s bits per second, not bytes; a bit is 1/8 of a byte). Dial-up was the go-to for the majority of Internet users (56 Kb/s at best), so every bit of data you could save was crucial.

 

In 1999 the infamous Napster was released, the service that really kicked off the music-sharing era. But with a 3-minute song weighing in at a whopping 30 megabytes, downloading it over a 56 Kb/s connection would take at least 75 minutes while completely saturating the line. That’s a long time for a single song.
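To see where that figure comes from, here's the same napkin math for the download (a sketch assuming an ideal, fully saturated 56 Kb/s line; real modems rarely hit that):

# How long a ~30 MB song takes over a 56k modem (idealized: no overhead, no slowdowns)
file_size_bytes = 30 * 1024 * 1024          # the 3-minute song ripped straight off a CD
line_speed_bits_per_sec = 56_000            # dial-up at its theoretical best

seconds = file_size_bytes * 8 / line_speed_bits_per_sec
print(f"~{seconds / 60:.0f} minutes")       # ~75 minutes, with the line completely tied up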

 

Winamp + Napster (The go-to back in the day)

 

Now the .mp3, formally introduced back in 1993, had its chance to spread its wings and begin its complete takeover as the de facto standard for audio storage. The magic behind the .mp3 (and lossy audio compression in general) is that it takes the original uncompressed file and, using a complex algorithm, removes ‘unnecessary’ details that your brain wouldn’t really notice were missing.

 

 

Originally most files were encoded at 128 Kb/s, and at that level you could get a decent-sounding file at a tiny fraction of the size of the original. Now that Internet speeds have drastically increased, you would typically encode at 320 Kb/s, the highest bitrate an .mp3 supports. Beyond that you run into the law of diminishing returns and it just isn’t worth it.
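For a sense of the savings, here's a rough comparison of the same 3-minute song at different constant bitrates (a sketch; with a constant bitrate the file size is just bitrate times duration, container overhead ignored):

# Approximate file size of a 3-minute song at various constant bitrates
DURATION_SEC = 3 * 60

def size_mb(bitrate_kbps: int) -> float:
    """File size in MB for a constant-bitrate stream."""
    return bitrate_kbps * 1000 * DURATION_SEC / 8 / (1024 * 1024)

for label, kbps in [("Uncompressed CD audio", 1411),
                    ("MP3 @ 320 Kb/s", 320),
                    ("MP3 @ 128 Kb/s", 128)]:
    print(f"{label:22s} ~{size_mb(kbps):5.1f} MB")   # ~30.3, ~6.9 and ~2.7 MB respectively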

 

Now there is also something called ‘lossless’ compression, usually stored in the previously mentioned .flac (Free Lossless Audio Codec) format. A losslessly compressed file retains the exact quality of the original but at a smaller file size. Instead of throwing information away, it squeezes out redundancy: the simpler and more predictable the audio, the smaller the file. So, for example, if you record something like an audiobook, where it is only a single human voice with plenty of quiet moments, it will compress far more than a dense, loud piece of music, while still playing back bit-for-bit identical to the original.
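You can see the "simpler audio compresses better" effect with any general-purpose lossless compressor. Here's a minimal sketch using Python's zlib as a stand-in for FLAC (FLAC uses audio-specific prediction, but the principle is the same): digital silence shrinks to almost nothing, while noisy data barely compresses at all.

import random
import zlib

# Fake "audio": one second of 16-bit mono samples (44,100 samples = 88,200 bytes)
silence = bytes(2 * 44_100)                                        # all zeros: digital silence
noise = bytes(random.randrange(256) for _ in range(2 * 44_100))    # random data: worst case

for name, data in [("silence", silence), ("noise", noise)]:
    compressed = zlib.compress(data, level=9)
    print(f"{name}: {len(data):,} -> {len(compressed):,} bytes "
          f"({100 * len(compressed) / len(data):.1f}% of the original)")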

 

There is also another type of audio compression which is compression of dynamic range, but that’s a topic for another post…


Audio Compression (Part 1)

Audio has been stored on many mediums over the years, from analog methods like Vinyl, 8-Track and Cassette to digital ones like CD, MP3 and FLAC. Vinyl is a bit of an exception to the general newer-is-better progression that tends to come with technological advancements. Audio is recorded onto a Vinyl record by etching an extremely small groove into the surface of the disc that is a physical representation of the sound waveform. The groove is cut in a spiral, typically running from the outside of the disc inward. To play the recording back on a turntable, you place a needle inside the groove; when the disc rotates at the right speed, the needle running along the groove picks up all the microscopic impressions, which are turned back into sound.

 

Turntable Needle "In the Groove"

 

The advantage of Vinyl is that it is a literal representation of the sound, meaning that when you play it back you are getting as close as possible to the original recording. It is often described as “warmer” and more natural sounding than its digital counterpart, the CD. It of course has its downsides, like the fact that you are scraping a metal needle against the play surface, so you physically degrade the record every time you listen to it. Downsides aside, it is the only analog medium that has stuck around, purely because of the way it sounds.

 

Digital media changed things up in that it samples the audio being recorded at a certain bit depth and sampling rate (16-bit at 44.1 kHz is the standard for a CD). This means it isn’t a constant stream of information being recorded like in analog; instead, the level of the signal is measured very, very quickly, and those stored samples are stitched together to form a digital audio file. Now that the information is digital it can be transferred, copied and modified.
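Here's a tiny sketch of that measure-and-store step: sampling a smooth sine wave at the CD rate and rounding each measurement to a 16-bit integer (the 1 kHz test tone is just an illustrative choice):

import math

SAMPLE_RATE = 44_100     # samples per second (CD standard)
BIT_DEPTH = 16           # bits per sample
TONE_HZ = 1_000          # a 1 kHz test tone, purely for illustration

max_level = 2 ** (BIT_DEPTH - 1) - 1    # 32,767: the largest 16-bit signed sample

# Take the first ten samples: measure the continuous wave, then snap it to an integer step
for n in range(10):
    t = n / SAMPLE_RATE                                    # when this sample is taken
    analog_value = math.sin(2 * math.pi * TONE_HZ * t)     # the smooth analog waveform
    digital_sample = round(analog_value * max_level)       # the quantized "staircase" value
    print(f"sample {n}: analog {analog_value:+.5f} -> stored as {digital_sample:+d}")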

The biggest change in going from analog to digital can be seen when you look at the waveform:

 

Analog vs Digital Waveform

 

See how the digital signal looks like a staircase? That's the sampling I was referring to. It makes the two look hugely different, but the sampling is being done, as I said, "very very quickly", so your brain shouldn't know the difference. That is only the first step towards creating something like an MP3, though, as a CD is stored in an uncompressed state. This isn't a problem when you are playing back the disc itself, but what about when you want to take all the songs off that CD and save them to your computer or play them on your phone? For that we need compression... which I will explain in Part 2.


Google Chromecast Desktop Casting

Google Chrome version 53 introduced native casting support for the Google Chromecast. This ability had been available for a while via an extension, but now that everyone with an up-to-date version of Chrome has it, let’s take a look at an interesting feature it includes.

 

For the uninitiated, the Google Chromecast is a little HDMI dongle that you plug into your TV that connects to your Wi-Fi network and allows you to stream content to your television from your Phone/Tablet/Laptop/Desktop. You can stream things like YouTube, Netflix and even audio-only like Google Play Music or Spotify.

Google Chromecast 2015

 

When you use Chrome to stream your content, you can cast a whole tab to your TV, which means you can stream any website (including sound) to your big screen; great for sites that don’t support the Google Cast function. Now, with the new version of Chrome, you can also stream your entire desktop (a feature previously in beta in the old Chromecast extension).

 

The Cast Option in Chrome's Dropdown Menu

 

Streaming your whole desktop lets you play back content that doesn’t have to exist on the web, like locally downloaded movies or TV shows. It even adjusts the audio delay to match network performance. The streaming quality is subject to several factors but is heavily dependent on your Wi-Fi network performance; the better your network, the better the quality (including framerate and compression artifacting).

 

Cast Desktop Option when you select a source

 

Ideally you should be using a strong 802.11ac router/access point and the most recent Chromecast (which supports 802.11ac Wi-Fi). There are rumors of a new Chromecast coming out next month (October 2016), which would be the third generation and allegedly brings 4K support along with it.


My Frankenstein Kitchen Audio System

So, like many, I have tried installing a Bluetooth speaker in the kitchen to listen to some tunes while doing dishes or the like. Also like many others, I was let down by the terrible sound quality, low volume and flakiness that is Bluetooth (sometimes). So, as a project for myself, I decided to whip up my own wireless streaming audio system.

 

To start, I used the two rear-channel speakers from my 5.1 home theatre setup (which is currently set up in a 2.1 configuration) that were collecting dust under our TV stand. I cleaned them off, took the speaker wire that was still connected to the also-collecting-dust center channel speaker, and cut and stripped it evenly so I had a pair of leads for each speaker.

 

Next, to power them, I needed an amp. After a bit of research into the power requirements of the rear-channel speakers and some Amazon-ing to find an amp that I liked, I ordered the LEPY-2024A+ Digital Amp.

 

Finally, I needed a way to get my music to the speakers. I had this piece in mind from the start: a Chromecast Audio from Google. You plug it into power and into your audio output (the amp), it connects to your home Wi-Fi network, and a little Cast icon shows up in compatible apps like Spotify and Google Play Music. You just hit the button, choose where you want your music to play, and away you go. It also works from inside Google Chrome on the desktop in both Spotify and Google Play Music.

 

They sound great and are really easy to start streaming to from any smartphone, tablet or desktop/laptop. A nice perk of the Chromecast Audio is that it handles the streaming itself; it isn't pulling anything from your device, so you don't even need to stay connected to Wi-Fi after you have initiated the stream.

 

Not too shabby.


The Road to UHD: 1440p (Part 2)

So who does this matter to? Well, the average person probably doesn’t care; the people who do range from computer hardware enthusiasts and gamers to prosumers and professionals. I technically fit all four of those categories, but hardware enthusiast/gamer is probably the one I most resonate with. Gaming on a higher-resolution screen is just better: everything is sharper, you can pick out details of things that are farther away, and you can see more fine detail up close as well.

 

This sounds great, so let’s all just upgrade to UHD monitors and everything will be great, right? Well, the problem with gaming at a higher resolution (especially UHD) is that your computer now has to render an image with 4x as many pixels. That poses a problem, as the graphics companies NVIDIA and AMD have only recently been targeting that magic UHD resolution with their top-tier GPUs ("Graphics Processing Unit": the part that renders all the graphics/images that show up on your screen).

 

Now enter 1440p. This once professional-only resolution is now the perfect stepping stone between HD and UHD. It sits neatly in the middle: 1440p has 33% more width and height than 1080p (about 1.78x the pixels), and UHD adds another 50% in each dimension on top of that (4x the pixels of 1080p overall). With UHD displays hitting the scene, 1440p displays have also been pushed way down in price, making them much more affordable than going all the way to UHD.
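Putting numbers to that (a quick sketch, assuming the common consumer resolutions of 1920x1080, 2560x1440 and 3840x2160):

# Pixel counts for common desktop resolutions
resolutions = {
    "1080p (HD)":  (1920, 1080),
    "1440p (QHD)": (2560, 1440),
    "2160p (UHD)": (3840, 2160),
}

base = 1920 * 1080
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:>9,} pixels ({pixels / base:.2f}x 1080p)")
# 1440p works out to ~1.78x and UHD to 4.00x, so 1440p sits roughly midway in GPU workload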

 

See the framerate hit going from 1080p (46fps) to 1440p (32fps)

 

Now, as I said, this is for the moment only really going to be sought after by the aforementioned groups of people and won't hit mass appeal for a while. But it's good to know what's coming down the road on the display front. You will probably see UHD pushed a lot more in the TV sector before it begins to replace 1080p as the standard desktop computer resolution. But it's only a matter of time.

 

P.S. I didn't mention Ultrawide monitors in this post, but that may be a topic for a future post :)


The Road to UHD: 1440p (Part 1)

Ultra High Definition (UHD), more colloquially known as 4K (technically incorrect, as 4K proper is defined as 4096 x 2160), is one of the next big things coming to displays, spanning from smartphones all the way to television sets. It is the successor to HD, or “1080p”, and weighs in at a resolution of 3840 x 2160, four times the pixels of HD. It has been a long time coming, as 1080p has been the highest resolution for the majority of displays for well over a decade.

While 1080p may be what the majority of people consider the highest resolution available, there is another, higher resolution that hasn’t had the same kind of attention (mainly because the highest-resolution content widely available runs at 1080p). QHD, or 1440p, is an in-between resolution of 2560 x 1440: roughly 1.8x the pixels of 1080p and a bit under half the pixels of UHD/4K. It was available as a display resolution long before the rise of UHD.

 

The problem with 1440p is that up until recently it has been largely restricted to computer monitors at or around $1000 and targeted at professional usage. With the new push for UHD, 1440p monitors have fallen in price to fit between HD and UHD screens. So why is this important? It’s all about pixel density.

Pixel density is the number of individual pixels squeezed into a given space, usually measured in pixels per inch (PPI, often loosely called DPI). The closer together the pixels are, the sharper the perceived image becomes. If you think about reading text on a typical 1080p computer monitor versus something like a magazine, it becomes very clear (pun accidentally intended) that the magazine is sharper and easier to read. That's because print media usually runs at 300 DPI or more, while a typical 1080p desktop monitor sits somewhere in the 80-100 PPI range.
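Pixel density is easy to work out yourself from a resolution and a diagonal size; here's a small sketch (the monitor sizes are typical examples, not any specific model):

import math

def ppi(width_px: int, height_px: int, diagonal_inches: float) -> float:
    """Pixels per inch: pixels along the diagonal divided by the diagonal length."""
    return math.hypot(width_px, height_px) / diagonal_inches

print(f'24" 1080p: {ppi(1920, 1080, 24):.0f} PPI')   # ~92 PPI
print(f'27" 1440p: {ppi(2560, 1440, 27):.0f} PPI')   # ~109 PPI
print(f'27" UHD:   {ppi(3840, 2160, 27):.0f} PPI')   # ~163 PPI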

 

So who does this matter to? Well, that I will have to leave for Part 2 of my post, in my usual fashion.


Wi-Fi Part 2

To understand why wavelength and bandwidth are linked, we should look at how data is actually transferred over the air. Thinking back to the days of Morse code, all we needed to do was interrupt the signal and someone on the other end could work out what was being said. What about sending audio over the air? Now that we need to send more data, we have to start modifying different parts of the signal. This is done through Amplitude Modulation and Frequency Modulation (and the lesser-known Phase Modulation), which we more typically refer to by their acronyms, AM and FM.

 

The EM spectrum moves in waves. It has peaks and valleys. We can alter the height of the peaks (amplitude), the distance between them (frequency) or add a delay or lag to each individual wave (phase). 

Amplitude and Frequency Modulation

 

Phase Modulation

These are great for sending an analogue signal (like audio from a radio station to your car stereo), but what about sending digital information like Wi-Fi? Digital information is stored in binary as a series of ones and zeros, so we can reuse existing methods like AM by defining which wave height translates to a one and which translates to a zero. Unfortunately, we run into the aforementioned issue of bandwidth: sending digital information at useful speeds requires significantly more bandwidth than these simple methods can provide. There are many more complex methods that go into sending data over Wi-Fi, but I won’t get into them here.
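As a toy illustration of the "AM for bits" idea, here's a sketch of simple amplitude-shift keying: each bit sets how loud the carrier wave is for a slice of time, and the receiver recovers the bit by measuring that loudness. (This is not how Wi-Fi actually encodes data; real Wi-Fi uses far more elaborate schemes.)

import math

CARRIER_HZ = 10                  # toy carrier frequency; real radio carriers are far higher
SAMPLE_RATE = 200                # samples per second of the simulated signal
BIT_DURATION = 1.0               # seconds per bit (absurdly slow, purely for illustration)
AMPLITUDE = {0: 0.2, 1: 1.0}     # a "0" is a quiet carrier, a "1" is a loud one

def modulate(bits):
    """Amplitude-shift keying: the bit value sets how loud the carrier is."""
    samples_per_bit = int(SAMPLE_RATE * BIT_DURATION)
    signal = []
    for i, bit in enumerate(bits):
        for n in range(samples_per_bit):
            t = i * BIT_DURATION + n / SAMPLE_RATE
            signal.append(AMPLITUDE[bit] * math.sin(2 * math.pi * CARRIER_HZ * t))
    return signal

def demodulate(signal):
    """Recover the bits by checking how loud each bit-sized chunk of the signal is."""
    samples_per_bit = int(SAMPLE_RATE * BIT_DURATION)
    bits = []
    for i in range(0, len(signal), samples_per_bit):
        peak = max(abs(s) for s in signal[i:i + samples_per_bit])
        bits.append(1 if peak > 0.5 else 0)
    return bits

message = [1, 0, 1, 1, 0, 0, 1]
print(demodulate(modulate(message)))   # [1, 0, 1, 1, 0, 0, 1]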

 

Unfortunately, this is where the long wavelengths we've been using hit the bandwidth wall. Because the waves are so long, we can't send ones and zeros quickly enough to be useful. So we need to shorten the wavelength and crank up the frequency. Older Wi-Fi versions used a frequency of 2.4 gigahertz, mainly because that band is unlicensed. Using 2.4 GHz meant sharing a frequency that many other devices were already using (like your microwave oven), and it ended up so cluttered with other signals that it frequently caused unreliable connections.

 

Current versions of Wi-Fi use a frequency of 5 gigahertz to move away from the cluttered, overused 2.4 GHz band. This means a cleaner signal and a big boost in available bandwidth. Unfortunately, shortening the wavelength again means sacrificing some of its ability to pass through walls, effectively shortening the distance the signal can travel indoors.
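The wavelength difference behind that trade-off is simple to compute: wavelength is just the speed of light divided by the frequency (free-space values; an AM radio station is thrown in for comparison):

SPEED_OF_LIGHT = 299_792_458   # metres per second

for label, freq_hz in [("AM radio (1 MHz)", 1e6),
                       ("Wi-Fi 2.4 GHz", 2.4e9),
                       ("Wi-Fi 5 GHz", 5e9)]:
    wavelength_cm = SPEED_OF_LIGHT / freq_hz * 100
    print(f"{label}: ~{wavelength_cm:,.1f} cm")
# AM radio: ~30,000 cm (300 m); Wi-Fi 2.4 GHz: ~12.5 cm; Wi-Fi 5 GHz: ~6.0 cm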


Wi-Fi Part 1

In the late 19th century we began manipulating the electromagnetic spectrum in order to transfer information across short and long distances. In the beginning all we knew how to do was disrupt the EM spectrum, creating simple audible beeps. Using a combination of long and short disruptions/beeps, a person could convert the pattern into letters and numbers (Morse code).

 

While this new communication method allowed people to communicate across long distances, it was really more of a glorified smoke signal: beeps aren’t a very natural communication method. Luckily, we figured out how to send audio over the same type of radio signal by making more fine-grained disruptions to the signal.

In the latter half of the 20th century there was a need to send even more complex data over radio signals. By making even more intricate changes to the signal, we discovered how to send binary digital information over radio.

 

So how the heck does this all work? Radio signals are part of the electromagnetic spectrum which is essentially light. Now, it isn’t the visible light we typically think of. Take a look at this diagram:

 

The electromagnetic spectrum

 

Look at the red line that shows the range of wavelengths making up the spectrum. Traditional radio broadcasts use long wavelengths (relatively low frequencies), which pass through objects like buildings well and can travel great distances. But wait, why doesn’t my wireless router have a range much beyond the outside of my house? The box for my wireless router says something like 2.4 GHz or 5 GHz, which is way higher than the frequencies used by broadcast radio. That is true, and the reason is an issue with bandwidth... and I'll continue this in my next post, Wi-Fi Part 2.


Where's my OLED Computer Monitor? (Part 2)

Now, if you’ve read Part 1 of this post from last month (here), you understand that OLED displays are awesome. So why don’t we have OLED computer displays? I think the primary reason is the same one that caused plasma displays to lose popularity (even though they also have awesome contrast), and it comes down to one thing: burn-in. (Cost is a factor as well, but that always comes down over time.)

 

Burn-in (or image retention) is a problem whereby, if a static image is left on the screen for long enough, a ghost of that image remains partially visible even after the screen changes to show something else.

 

Here is a good example of burn-in occurring on an OLED display on the Motorola 360 Smartwatch

 

Burn-in is one of the main reasons people bought LCD televisions over plasma: fear of ruining the display if, for example, they left a movie paused for a long time or kept a news channel on with big graphic elements that don’t change. People were swayed to LCDs even though plasma had significantly better contrast and a lower cost than a same-size LCD counterpart.

 

OLED displays, like plasma, are indeed more susceptible to burn-in, so using them in a desktop or mobile computer scenario with static, unchanging UI elements (e.g. the taskbar) doesn’t seem to make as much sense. Manufacturers remember what happened to plasma and are advancing the technology in other devices like smartphones, watches and now high-end televisions.

 

What is different this time around is that intelligent engineering can greatly overcome this issue, and it is improving with every generation. Some people have even started using those OLED televisions in place of a computer monitor, even though they are significantly larger and eat up a lot of desk space.

 

Let’s not forget about CRTs. Remember those? Remember that setting in your display options, probably turned off these days, called Screensaver? It was created because CRTs are also susceptible to burn-in. I think we are ready to see OLED panels on the desktop. Burn-in is no match for the Windows 95 3D maze screensaver!

 

Windows 95 3D maze screensaver


Where's my OLED Computer Monitor? (Part 1)

It’s been a few years now that new TVs are being released with OLED (Organic Light Emitting Diode) panels instead of the common LCD (Liquid Crystal Display) tech that electronics have used forever. Smartphones are also more frequently being released with OLED displays (Samsung’s Galaxy S phones are a good example). So why is this a big deal? It’s all about one thing: contrast.

 

The display you are probably reading this on is most likely an LCD panel. The TL;DR of how LCDs work is this: starting from the back, there is the appropriately named backlight, which is the only thing in the display that actually emits light. That light passes through a series of polarising filters and the LCD panel itself, which contains millions of tiny sub-pixels, one each for red, green and blue (RGB). The sub-pixels are combined to create any of ~16 million different colours (256, or 2⁸, levels for each colour).
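That "~16 million colours" figure falls straight out of the bit depth (a sketch assuming the standard 8 bits per channel):

BITS_PER_CHANNEL = 8                  # standard 24-bit colour
levels = 2 ** BITS_PER_CHANNEL        # 256 intensity levels per sub-pixel
total_colours = levels ** 3           # every red x green x blue combination

print(f"{levels} levels per channel -> {total_colours:,} colours")   # 16,777,216

# Any single colour is just one level per sub-pixel, e.g. a warm orange:
r, g, b = 255, 140, 0
print(f"#{r:02X}{g:02X}{b:02X}")      # the familiar hex notation: #FF8C00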

 

The problem is that the backlight turns on and stays at the same brightness level regardless of what is being displayed. Whether you are watching a very dark movie or a bright and happy cartoon, the display is putting out the same amount of light, which leads to pretty terrible contrast. If you have the screen display solid black and turn off all the lights, the room will still be lit by the light leaking through the screen.

 

 

OLED displays, on the other hand, have fantastic contrast. The big reason is that the OLED panel itself emits light (it even says so in the name), which means there is no backlight. No backlight means every pixel on the display can output its own amount of light, and each pixel can also completely stop emitting light, which gives another big contrast boost. If you tried the dark-room experiment above with an OLED display, you would actually be in the dark.
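Contrast ratio is just the brightest white a panel can show divided by its darkest black, which is why "the pixel can switch completely off" matters so much. A sketch with purely illustrative numbers (not measurements of any real panel):

# Contrast ratio = brightest white / darkest black
def contrast_ratio(white_nits: float, black_nits: float) -> str:
    if black_nits == 0:
        return "effectively infinite (true black)"
    return f"{white_nits / black_nits:,.0f}:1"

print("Typical LCD:", contrast_ratio(300, 0.3))   # ~1,000:1 (the backlight always leaks a little)
print("OLED:       ", contrast_ratio(300, 0.0))   # the pixel simply turns off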

 

Now that we know OLED displays are awesome, why aren't we seeing desktop computer monitors (or laptops, for that matter) being released that use them? That's exactly what I'll get into in Part 2.

 

P.S. It’s worth mentioning that “LED monitors” are not the same thing. Manufacturers, in their infinite wisdom, decided to call their displays “LED monitors” to make customers believe they were getting some fancy new screen technology. In reality, all it means is that the LCD display (yes, it is still an LCD panel) uses LEDs as a backlight instead of the older CCFL (Cold Cathode Fluorescent Lamp) tubes.

