Wednesday, October 26, 2016

Notes from Wired "Inside the Cyberattack That Shocked the US Government"

Wired article, "Inside the Cyberattack That Shocked the US Government" had several phrases that rang true and seemed deserving of as much attention as daily flossing (perhaps not the most glamorous or pleasant of tasks yet with possibly dramatic improvements in one's social standing).

Main takeaways (for me):

  1. "Basic hygiene":
    • ...“basic hygiene”—that is, making simple upgrades that can drastically reduce an organization’s susceptibility to attack. These include measures such as keeping current with the latest software patches, reducing the number of network users with administrative privileges, and, above all, broadening the adoption of multifactor authentication
  2. “Don’t waste a good crisis” - i.e., use a disaster as the motivation and driving force to implement better security, such as the "basic hygiene" above.
  3. "Better cooperation" - "between public and private sector" in the article - and within the department and company: sharing information on suspected threats with all stakeholders has the potential to radically speed up both detection and mitigation.
  4. "...fundamental flaw in our approach to security: We’re overly focused on prevention at the expense of mitigation. One reason these attackers can do so much damage is that the average time between a malware infection and discovery of the attack is more than 200 days, a gap that has barely narrowed in recent years."
  5. "The first item groups like these usually swipe is the master list of credentials—the usernames and passwords of everyone authorized to access the network. The group’s foot soldiers will then spend weeks or months testing those credentials in search of one that offers maximum system privileges; the ideal is one that belongs to a domain administrator who can decrypt data at will. To minimize their odds of tripping any alarms, the attackers will try each credential only once; then they’ll wait hours to try the next. Since these hackers are likely salaried employees, investing that much time in an attack is just part of the job.

    "There is a straightforward way to foil this approach: multifactor authentication."
Here's to better teeth - if only security-wise.

Sunday, May 26, 2013

HP workstations and SFF drives

Some HP servers (that are actually smaller than an HP Z820) can fit up to 16 (yes, sixteen) SFF drives.


HP ProLiant ML370 G5 server with a 16-bay backplane


HP Z820 without side cover
How many can a Z820 fit? Four in standard LFF (Large Form Factor) bays (see "8" in the illustration above) and 4-6 more, two per optical bay, using an adapter like this:


HP 2.5in HDD 2-in-1 Optical Bay Bracket P/N FX615AA

...or this:



...that fits in a 5.25" optical bay.

That's a total of ten (eight if there's a DVD drive), while the system has fourteen SAS/SATA ports. You can see where we're going with this.

Why SFF (Small Form Factor) drives? For one, they're the present and immediate future of storage. SSDs are mostly SFF or smaller. SFF drives benefit from less vibration and can be made to higher tolerances. More SFF drives fit in the same space, which means higher speeds when you stripe them - and that, in turn, means you may not need a heavy, bulky, expensive external SAS expander box to house a bunch of drives for those uncompressed 4K workflows.
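
To put rough numbers on that "uncompressed 4K" point, here is a back-of-the-envelope sketch; the 10-bit 4:2:2 sampling, 24 fps and ~130 MB/s sustained per spinning SFF drive are my assumptions, not measured figures.

    # Back-of-the-envelope: how many striped drives does uncompressed 4K need?
    width, height = 4096, 2160       # DCI 4K frame (assumed)
    bits_per_pixel = 20              # 10-bit 4:2:2 sampling (assumed)
    fps = 24
    per_drive_mb_s = 130             # assumed sustained rate of one SFF spinning drive

    frame_mb = width * height * bits_per_pixel / 8 / 1e6
    stream_mb_s = frame_mb * fps
    drives_needed = -(-stream_mb_s // per_drive_mb_s)   # ceiling division

    print("%.1f MB/frame, %.0f MB/s sustained, ~%d striped drives (before RAID overhead)"
          % (frame_mb, stream_mb_s, drives_needed))
    # -> roughly 22 MB per frame and ~530 MB/s, i.e. about five drives in a stripe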

Enter the HP 4-in-1 SFF (2.5in) HDD Carrier (P/N B8K60AA), retailing for $139, which is really an Icy Dock MB994SP-4S - it says so on the label.
HP 4-in-1 SFF (2.5in) HDD Carrier

This metal device holds four SAS or SATA drives (spinning or SSD) and has two fans, four SAS ports, and a single Molex 4-pin power connector in the back. Two of those, and you can have a total of 12 drives in the system. Not quite server grade: HP server drive trays are much, much nicer, support hot swapping with the right controller, and their activity lights indicate possible faults and RAID set identification, besides, well, activity. This unit's activity lights are tiny, hard to see, and in my experience don't always work. We've asked HP to come up with server-grade backplanes for the workstations and are still hoping it might happen.

How many of these units can fit in a Z820? Theoretically three, as there are three "optical" (5.25" half-height) bays in a Z820 - two if an internal CD or DVD drive is used. They're a rather tight fit against the CPU fan housing, so choose your SATA cables wisely: those with long connector necks may not work well.

Was this helpful? Leave a comment.

Monday, July 16, 2012

Did YouTube do away with 4K?

Searching for 4K (4096 pixels horizontally by 2304 or some other number vertically) videos on YouTube today, I realized there aren't any - including those that certainly were 4K the last time I watched them, probably 6 months to a year ago.

If you are interested in videos with resolutions higher than 1080p, you may remember that YouTube announced support for full 4K resolution in its official blog post of July 2010, "What's bigger than 1080p? 4K video comes to YouTube". No subsequent blog posts seem to announce any changes to supported resolutions. In my blog post shortly after YouTube's announcement, I pondered the reasons YouTube decided to support 4K given the extreme rarity of computer systems and devices capable of displaying it. My guess was that YouTube was future-proofing itself and playing with possibilities.

Here is the playlist of 4K/2K clips I assembled. Quite a few of them show a posted resolution of 4096xN, where "N" is the vertical resolution, usually between 2160 and 3072. All of these videos now play at no more than 2048x1536. You can verify that by right-clicking on the video and selecting "Show video info".
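
For a check outside the player itself, a format-listing tool such as youtube-dl will show every resolution YouTube is actually willing to serve for a clip; a quick sketch (the URL is a placeholder, and youtube-dl must be installed and on the PATH):

    import subprocess

    # List the formats YouTube offers for one of the playlist clips.
    # "-F" / "--list-formats" prints the available resolutions without downloading.
    url = "https://www.youtube.com/watch?v=VIDEO_ID"   # placeholder, not a real clip
    subprocess.run(["youtube-dl", "-F", url], check=True)

If the largest format listed tops out around 2048x1536, the higher-resolution versions are apparently no longer being offered at all.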

Do you know why and when YouTube decided to stop supporting full 4K videos?

(A 2K video sample. To watch it in full 2K glory, click "Play" first; then select "original" in video settings - the wrench symbol in the bottom right of the player box. Make it full screen. Pause and wait till it buffers if you don't have a particularly fast Internet connection. It helps if your monitor is large with a native resolution higher than 1080p.  How's the viewing experience?)


P.S. 4K is not the same as 4096p, and 2K is not the same as 2048p. In "4K", "2K" and similar designations, the number (such as 4096) stands for horizontal resolution, e.g. 4096 pixels horizontally by 1536 vertically. Designations such as 1080p (along with 720p, 480p, 1080i, etc.) refer to vertical resolution (e.g. 1920 horizontally by 1080 vertically). In the old analog TV days, signal resolution was also measured horizontally - in the number of vertical lines - much like today's 4K.
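
To make the two naming conventions concrete, here is a toy helper (my own sketch, not any standard) that labels a frame size both ways - cinema-style by horizontal pixel count and broadcast-style by vertical line count:

    def describe(width, height):
        """Label a frame size by horizontal pixels (4K/2K style) and vertical lines (1080p style)."""
        cinema = "%dK" % round(width / 1024.0)   # 4096 -> "4K", 2048 -> "2K" (approximate)
        broadcast = "%dp" % height               # 2160 -> "2160p", 1080 -> "1080p"
        return "%dx%d: ~%s (horizontal count), %s (vertical count)" % (width, height, cinema, broadcast)

    print(describe(4096, 2160))   # 4096x2160: ~4K (horizontal count), 2160p (vertical count)
    print(describe(1920, 1080))   # 1920x1080: ~2K (horizontal count), 1080p (vertical count)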

Tuesday, June 26, 2012

HP Z820 memory benchmark

The HP Z820 is roughly 80% faster than the HP Z800 in memory bandwidth, as measured by the WinSAT memory speed test.  This is very significant for memory-intensive applications such as video editing, encoding, etc.

HP Z820 (B2C03UT, one Xeon E5-2630 CPU, 32GB DDR3-1600 RAM running at 1333MHz): 27.6GB/s.

HP Z820 B2C03UT, 32GB RAM
Compare that to HP Z800 (FL878UT, two E5530 CPUs, 18GB DDR3-1333 RAM running at 1066MHz): 15.2GB/s


HP Z800 FL878UT 18GB RAM
To perform the test, run "WinSAT mem" from an administrative command prompt ("cmd" run as administrator).  For instructions, refer to the "Quick Disk Benchmark in Windows" post on this blog.
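
If you want to script the comparison across several machines, something like this sketch shells out to WinSAT and pulls the bandwidth figure from its output; note that the exact wording of WinSAT's report varies by Windows version, so the parsing pattern is an assumption you may need to adjust.

    import re, subprocess

    # Run the WinSAT memory test (needs an elevated/administrative prompt on Windows).
    result = subprocess.run(["winsat", "mem"], capture_output=True, text=True, check=True)
    print(result.stdout)

    # WinSAT reports memory throughput in MB/s; grab the first "<number> MB/s" it prints.
    match = re.search(r"([\d.]+)\s*MB/s", result.stdout)
    if match:
        print("Memory bandwidth: %.1f GB/s" % (float(match.group(1)) / 1000.0))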

Theoretically, the Z820's quad-channel memory configuration (vs. triple-channel in the Z800) accounts for a ~33% increase, and the faster memory clock (1333MHz vs. 1066MHz) for another ~25%; combined, the theoretical gain is roughly 67% - still short of the 80% improvement we clocked.  I will try to find a faster Z800 and benchmark it as well.
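
For reference, the theoretical peak figures behind that estimate come from simple channels x transfer-rate x 8-bytes-per-transfer arithmetic (a sketch that ignores the Z800's second CPU and other real-world effects):

    def ddr3_peak_gb_s(channels, mt_s):
        """Theoretical peak: channels x million transfers/s x 8 bytes per 64-bit transfer."""
        return channels * mt_s * 8 / 1000.0   # MB/s -> GB/s

    z820 = ddr3_peak_gb_s(4, 1333)   # quad channel at 1333 MT/s -> ~42.7 GB/s
    z800 = ddr3_peak_gb_s(3, 1066)   # triple channel at 1066 MT/s -> ~25.6 GB/s
    print("Z820 %.1f GB/s, Z800 %.1f GB/s, ratio %.2fx" % (z820, z800, z820 / z800))
    # Measured WinSAT numbers: 27.6 vs 15.2 GB/s, a 1.82x gap - wider than the ~1.67x theory predicts.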

What is the memory speed of your system?

Friday, June 1, 2012

PC Systems Integrators: Doomed?

(image courtesy of  ZDNet)
Jason Perlow's article in ZDNet entitled "Post-PC era means mass extinction for personal computer OEMs" predicts the demise of PC manufacturing as we know it: Dell and HP will go the way of the dinosaurs unless they make dramatic changes.

More importantly for specialty systems integrators like us, the business as we know it will dry up on several fronts.  One: there will be fewer COTS (commercial off-the-shelf) components to choose from, due to the natural migration from desktops and workstations to tablets and laptops.  Two: video editing is changing dramatically as well, migrating from specialty workstations and desktops to (a) mobile platforms, including tablets and even phones, and, in large-scale (studio) environments, eventually to (b) client-server systems where editing can be done on any screen connected to a private cloud.  Considering the advances in web applications, web-based video editing can't be too far off; its feasibility is mainly a question of available bandwidth and its cost.  Tablet- and web-based video editing will require very little integration.

Consider the takeaways:

"Give a developer or someone in the scientific/engineering field a lot of back end server power on a private cloud and a professional monitor attached to a thin client, along with technologies like Microsoft’s RemoteFX for server-side GPU rendering of virtual desktops, and the need for those big desktops could entirely disappear."

If you have any doubts, consider that gains in computer performance have always outpaced the demand for it in video applications.  Ten years ago, you absolutely needed an expensive workstation to work with even mildly compressed SD material.  Today, HD editing is almost child's play on sub-$1K laptops, and a $2K laptop will afford GPU acceleration in Premiere Pro CS6.  While editing 4K on an ultrabook is probably a pain, the day is near when it will be a reality.

The demand for desktops and workstations steadily drops.

Are PC and specialty integrators doomed?  Yes and no.  We are doomed if we don't adapt and diversify into areas where integration and engineering will still be needed.  Here are some ideas:

  • Private clouds.  There are already remote access and VDI technologies at work that focus on speed and not just functionality.  Engineering such private clouds will require serious expertise for some time to come.  While no DCC vendor that I know of expressly supports VDI, there is little doubt it's coming.
  • Focus on mobile.  For every workstation configuration, offer at least one mobile one.  "Desktop replacement" mobile solutions for high resolution video editing will likely need high-speed and possibly fault-tolerant storage and external GPUs; and those are less likely to be purchased from retail than from specialty integrators, for now.
  • Collaborative editing and DCC; MAM (Media Asset Management).  Environments with shared and collaborative access to media assets (think Avid Unity, EditShare, Vizrt, CatDV).
Will some content creators still require high-powered desktops and workstations with stacks of expensive GPUs inside?  Probably.  The real question is, will there be nearly as many vendors and integrators offering them in the not-so-distant future?  As Jason points out in his article, "...to that, the answer is a resounding no."

Monday, April 16, 2012

AJA Announces Ki Pro Quad

AJA Ki Pro Quad and Canon Cinema EOS C500
AJA today announced the Ki Pro Quad, which accepts RAW 4K over SDI, simultaneously (or subsequently) outputs that data via Thunderbolt, and records a compressed signal to SSDs.  "A scaled or cropped output is also simultaneously available for 2K or HD monitoring via dedicated SDI and HDMI connections," says AJA.

The product will be available "later in 2012" for an MSRP of $3,995 US.

Some details are rather sketchy, probably on purpose:

  • What are the compression options?  If the Ki Pro Quad is anything like the Ki Pro, it will be ProRes and DNxHD.  Yes, the same ProRes that is on its deathbed following the spectacular demise of Final Cut Pro 7 last year and the mass exodus of editors to Avid, Adobe and even Sony.  DNxHD could not have been timelier.  No CinemaDNG?
  • Thunderbolt (TB) support is coming to Windows this month, although not on HP's just-released flagship workstations, the Z820 and Z1.  For now, TB is confined to Macs (iMac, Mac Mini) and MacBooks - which is fine for KPQ, a portable device by design.  Still, with no TB yet on Mac Pros, does KPQ's inclusion of it mean there will be a new Mac Pro this year with TB built in?
  • Uncompressed 4K recording?  Thunderbolt can be used to connect to a storage array.  Will KPQ be able to pipe raw uncompressed 4K video straight to a Thunderbolt array, bypassing a computer?  Probably not, although you can connect a KPQ and a TB storage array to the same TB port on a MacBook simultaneously, which may make uncompressed 4K recording possible.
(I hope you will forgive my calling the already laconic "Ki Pro Quad" a "KPQ".  It just sounds good.)

Enjoy the video: spectacular cinematography, fantastic low-light shallow DOF, giant close-ups of Japanese designers and execs.  I wonder what camera it was shot on? Wow, really?



AJA Ki Pro Quad: Efficient 4K workflows (from AJA Marketing on Vimeo)

Friday, April 13, 2012

Premiere Pro CS6: what’s new and changed

See Todd Kopriva's blog for an extensive list of "what's new and changed".  Here are just a few points:
  • a trimmed, lithe user interface; less waste, more customization including customizable buttons
  • OpenCL support (on some systems) for GPU acceleration
  • expanded multi-camera editing
  • native support for new formats (such as ARRI Alexa)
  • basic color grading built-in; integration with SpeedGrade
  • improved dynamic-linking
  • full-screen playback on primary monitor - yay!
  • uninterrupted playback (while you fiddle with timeline, settings, etc.) - yay!
  • new way for hardware manufacturers (AJA, Blackmagic Design, Matrox, etc.) to write drivers and interface with Premiere Pro; drivers for popular hardware are supposed to be ready by NAB (April 15)
There will be more; stay tuned.
