Showing posts with label personal computing. Show all posts

Intel 4th gen Core processors launch date announced – quite some nanoseconds, that is

Intel 4th gen

Source : Intel Twitter Official

Well, Intel has announced the time left until the launch of the 4th generation member of its uber-successful Core series of processors, named Haswell, and as you can see in the pic, that value is in nanoseconds. For your convenience, I converted it, and it works out to 38 days from now. Frankly, it's really hard for me to keep my excitement down, because you can safely expect your personal computer and laptop to break some power limits with the 4th gen monster inside.
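If you want to check the math yourself, the conversion is trivial. A quick sketch (the exact nanosecond figure from Intel's teaser image isn't reproduced here, so a value equivalent to ~38 days is assumed):

```python
# Convert a countdown given in nanoseconds to days.
NS_PER_DAY = 24 * 60 * 60 * 10**9  # 86,400 seconds per day, 1e9 ns per second

def ns_to_days(ns):
    return ns / NS_PER_DAY

# Assumed countdown value (the teaser's exact figure isn't shown here):
countdown_ns = 3_283_200_000_000_000
print(ns_to_days(countdown_ns))  # 38.0
```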

If you remember from earlier, Intel's 4th gen processors are set to get voice and gesture control options. Haswell is built on the same 22 nm process as the 3rd gen Ivy Bridge, but it brings better energy efficiency, more power (roughly double the GPU performance of Ivy Bridge) and several expected security advancements. It will top out at quad-core versions only, in spite of many wanting octa-core PCs after the Exynos 5 Octa in the Samsung Galaxy S4; that will have to wait for Intel's 5th generation Broadwell, which will be made on a 14 nm process. Though I am wishing for things to reach 16 cores by then. Haswell also has, by the way, a new cache design and, more excitingly, possible support for 20 Gbps Thunderbolt!!!

Overall, the thing I postponed buying a laptop for has at last arrived, and maybe we will see some great devices with ‘Intel inside’. Stay tuned for the latest updates on this new family of processors, as well as devices powered by them.

Are More Expensive LCD Screens Actually Better Quality?

Getting a good computer with an LCD screen that's bright, sharp and long-lasting is important. That's especially significant if the computer is one that's used for work, or used by multiple people for important or complex projects. One of the biggest misconceptions is that more expensive LCD screens will provide better quality. Sometimes that's the case, but often the price of the screen has little to do with how good it looks or how long it lasts. Because of that, a lot of people are overpaying for LCD screens.


Whether they buy a computer with a built-in screen (like a laptop) or one where the screen is a separate component (a desktop model, for example), there may come a time when the screen becomes damaged and needs to be repaired or replaced. How much a person wants to spend - and how much he or she can afford to spend - both matter when getting a new screen. Even just having an LCD screen repaired can be costly, depending on the size of the screen and the damage that has been done to it.

Source : Avreview
One of the most common injuries to an LCD screen is from someone dropping the computer or other device. People can also drop things onto the screen, and sometimes the wires and other connections can simply wear out over time. That's especially true with laptops, because they are opened and closed so much. No matter the cause of the screen damage, repair or replacement is available. People who can't spend a lot might think that they won't get good quality, or that the screen they receive is substandard in some way. That's really not the case. Screens are screens, and they all have to be made in a particular way.
Anytime an LCD screen is created, specific components are put together and then tested to make sure they work. Screens come with warranties and must meet certain standards, so they are basically the same. Specific brand names will cost more, and some do have a better reputation than others, but at the end of the day there is surprisingly little difference between makes and models of LCD screens. Anyone needing a screen replaced or repaired should focus on the quality of the person or company doing the job, and not so much on the brand of screen that he or she is getting.


About the author - Louis Rossmann is a straight shooting tech guy who specializes in repair and LCD replacement.  Contact him through his Rossmann Supply website.

USB 3.0 upgrades to a mind-numbing 10 Gbps

USB 3.0 is set to get a speed boost from 5 Gbps to 10 Gbps (Image: Shutterstock)


The USB Promoter Group announced at the ongoing CES 2013 an enhancement of USB 3.0 (aka SuperSpeed USB) that doubles its data transfer rate from 5 Gbps to 10 Gbps, with enhanced connectors that will be, to general relief, backward compatible with earlier USB 3.0 and USB 2.0 devices.

This brings USB 3.0 up to the level of Thunderbolt, which also offers a 10 Gbps transfer rate. In spite of Thunderbolt devices coming up, the new USB 3.0, with its loyal following among manufacturers, is expected to take over soon.

The enhanced SuperSpeed USB 3.0 specification is up for industry review in Q1 of 2013, with completion expected by mid-year.
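To put the doubling in perspective, here's a back-of-the-envelope sketch of raw transfer times (ignoring protocol and encoding overhead, so real-world figures will be somewhat worse):

```python
def transfer_seconds(size_gb, rate_gbps):
    # size in gigabytes -> gigabits, divided by the line rate in gigabits/second
    return size_gb * 8 / rate_gbps

print(transfer_seconds(50, 5))   # 80.0 -> a 50 GB file at the current 5 Gbps
print(transfer_seconds(50, 10))  # 40.0 -> the same file at the new 10 Gbps
```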

Source : USB 3.0 Promoter Group

Lenovo brings out a 27-inch Windows 8 table PC


Lenovo ups the ante with 27-inch Windows 8 table PC
So Lenovo starts off with a bang at this year's CES, with its IdeaCentre Horizon table PC (notice this is table, not tablet). This flat-lying AIO PC is marketed as a multi-user, multi-touch, multi-mode device, promoting a family sharing experience. It is like the prototype Microsoft Surface table PC (though that is 40 inches, with 40-point multi-touch, 3D sensors and more, so I won't compare the two here).

The IdeaCentre Horizon runs on Windows 8 and has a 10-point multi-touch screen, but the cool thing is that it comes with many customized pre-loaded games from EA and Ubisoft!! :D

This one ain't half bad in terms of specs either: an Intel Core i7, NVIDIA GeForce graphics, a Full HD screen and a stand. Lenovo also provides e-dice, four joysticks and strikers for gaming on the screen. It's priced at a fairly high $1,699 (which is still less than the almost $3,000 of the Microsoft Surface table xD...)


Along with it, Lenovo also showcased a concept table PC with a huge screen, called 'Gamma'.

Source: Techradar

ASUS systems to get 3D motion control with "Leap"


The Leap Motion controller that will be bundled with ASUS systems


First PrimeSense (the company behind the Kinect's sensor), and now Leap Motion: ASUS has joined hands with the company responsible for a sensor that enables system control via finger and hand gestures. The 3D controller will be bundled with high-end ASUS notebooks and all-in-one (AIO) PCs by the end of 2013.
The Leap Motion controller tracks all 10 of the user's fingers at a lightning-fast 290 frames per second, registering movements down to 1/100th of a millimeter (that is 10 micrometers!!!). The company claims it is 200 times more accurate than competitors such as Microsoft's Kinect for Windows.
Although you can pre-order the accessory from the company's website for $69, given its immense potential it seems certain that it will eventually be integrated into future systems rather than sold as a separate peripheral.
Albert Wu, Desktop Division Senior Director at ASUSTeK, said, "Leap Motion has developed an exciting technology that will truly enhance the experience our customers have with their ASUS devices, opening a world of opportunity for personal use and business, from entertainment to architecture to education."

Check out the controller in action in the video below, and do leave your comments!!


Eye tracking by Tobii Rex for Windows 8


Tobii's REX (Developer Edition pictured) brings eye-tracking capabilities to Windows 8 PCs

Tobii has been working on eye tracking technology for over a decade, and after showcasing its Gaze UI for Windows 8 at last year's CES, it is ready to put its first consumer-level eye tracking device in front of the world at the upcoming CES 2013, bringing eye tracking functionality to any Windows 8 based PC. It can scroll, zoom, navigate and select by tracking the viewer's eyes.

Attached at the bottom of the screen and connected with a USB cable, it is much like the Tobii PCEye, which is designed for users with impaired motor skills. However, the Rex is meant to be used in combination with the keyboard, mouse and touch controls, not to replace them.

This year, only 5,000 limited edition units will be produced by Tobii, with pricing yet to be announced. However, the Rex developer edition is currently available for devs to create and improve various apps that work with it, at a cost of around $995.

The pics currently shared by Tobii are of the developer edition, but I expect the consumer edition to be a bit slimmer and glossier. XD
CES 2013 is what we've got to wait for, again!

Source: Tobii

Professor's algorithm writes technical reports, romance novels could be next


Philip M. Parker has created a computerized system to automatically compile data into book form (Image: Shutterstock)

Philip M. Parker, a marketing professor at INSEAD (the European Institute of Business Administration), has written and patented a system that uses an algorithm to automatically compile data into book form. Between his works and those of his research group (ICON Group International), he has over 900,000 books currently for sale on Amazon. More than a smart search engine, his system only requires a few minutes or a few hours to scan the databases relevant to any given topic and organize that data into a technical report. Next stop? Romance novels.
There are few things in life quite as boring as writing a technical report. You accumulate all available data on the topic, then categorize and prioritize the information. A general structure within which to present the data is then chosen from a few common structures, whereupon the collected and sorted information is presented as a report. Such reports are formulaic – produced in accordance with a slavishly followed rule or style – and their generation is largely a process of intellectual drudgery requiring very little creativity. It's exactly the sort of task for which computers were developed.
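That workflow is easy to caricature in a few lines of code. The toy below is purely my own illustration (Parker's patented system is vastly more sophisticated): gather facts about a topic, then pour them into a fixed report template.

```python
# Toy template-filling "report writer" (illustration only, not Parker's system).
facts = {
    "topic": "Blepharitis",
    "definition": "an inflammation of the eyelid margins",
    "note": "usually chronic but manageable",
}

TEMPLATE = (
    "Report on {topic}\n"
    "Overview: {topic} is {definition}.\n"
    "Notes: The condition is {note}."
)

report = TEMPLATE.format(**facts)
print(report)
```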
Prof. Parker, an author of several conventionally written technical and business reports, realized one day that the process of writing such a report can be described in terms of a reasonably well defined algorithm. He then set out to program a computer to carry out this algorithm, for which he was issued US Patent 7,266,767 (Method and apparatus for automated authoring and marketing).
Parker designed the algorithm to follow (hopefully closely) the path that an expert would take in writing a summary about a data-rich subject. There are similarities to IBM's Jeopardy grandmaster Watson, which also casts a wide data net, then organizes and summarizes the data so it can respond rapidly to questions.
Some examples of books written by Parker's program include:
  • Satirists: Webster's Quotations, Facts and Phrases
  • The 2007 Report on Little Cigarette-Size Cigars Weighing Less Than 3 Pounds Per 1,000 Cigars: World Market Segmentation by City
  • Webster's English to Portuguese Brazilian Crossword Puzzles: Level 10 (Portuguese Edition)
  • Webster's Hiligaynon - English Thesaurus Dictionary
  • The Official Patient's Sourcebook on Blepharitis
While some of these titles seem difficult to believe, there is a (generally small) market for each of them. If a person or company needs in-depth data about a subject, however narrow, that data has value in organized form. Parker's program works especially well with modern distribution technologies, turning print-on-demand into written-on-demand.
Parker has also used an outgrowth of his algorithm to write a comprehensive set of poems about roughly 80,000 words in the English language. Totopoetry is a collection of algorithmically authored poetry that neatly illustrates the strengths and limitations of algorithmic writing. Poems are written in 17 styles (e.g., Haiku, limerick, sonnet) for each word in English.
Each poem is intended to illuminate the meaning of the word on which it is based. For example, the octosyllable poem for "poetry" reads:
         "Really instant and overt.
         But also distant and covert."
 – Totopoetry
And then there is the modern Haiku form, again on "poetry":
         "An executive,
         evokes many directions,
         numbers and manners"
 – Totopoetry
At present Parker's algorithm cannot judge its work against some intrinsic or personal measure of poetic merit.
The next area of formulaic writing to which Parker wants to adapt his algorithm is romance novels, which are widely (perhaps unfairly) denigrated as "cookie-cutter" literature. Parker believes their simplicity and limited plot structure suggest romances as the best target for an early attack on fiction writing. Regardless of his level of success, human authors are likely to face progressively more competition from algorithmic authors over the next decade or so. At this point it seems likely that the place of the best human writers is probably safe, but for how long? Time will tell.
Source: ExtremeTech.com , Gizmag

HP targets enterprise users with EliteBook Revolve convertible tablet


HP's EliteBook Revolve tablet is aimed at enterprise users
With the success of the iPad, it’s easy to forget that convertible units with a rotating hinge and physical keyboard were once the form factor of choice for tablet computers. While touchscreen-only devices now dominate the consumer space, convertible tablets continue to find a market, particularly amongst business and government users. It is these markets that HP is targeting with its latest touch-enabled convertible tablet, the EliteBook Revolve, which is due for a March, 2013 release.
Powered by third-generation Intel Core processors and with 4 GB of RAM, the EliteBook Revolve features a touch-enabled, 11.6-inch, 1,366 x 768 pixel resolution display that pivots and twists to switch between traditional laptop and tablet form factors. The display is made from scratch-resistant Gorilla Glass 2, while the all-magnesium chassis helps keep the unit’s weight down to around 3.04 lb (1.37 kg) – that figure is based on the preproduction model, as is the estimated 22.4 mm thickness of the device.
The full-sized physical QWERTY keyboard features a backlight and is spill-resistant, while the screen orientation and brightness will adjust according to the unit’s position. Connectivity features include optional WWAN (LTE in the U.S. and HSPA+ elsewhere), secure NFC, Bluetooth, Ethernet port, two USB 3.0 ports and a DisplayPort.
HP's EliteBook Revolve features a rotating hinge to switch between laptop and tablet form ...
Onboard storage comes in the form of a 256 GB SSD, with a 720p HD camera positioned above the display, a dual-microphone array, DTS Studio Sound and CyberLink’s YouCam software also on board. An optional pen is also available for touchscreen input. The battery should be good for eight to ten hours of use, and 210 hours of standby.
Although it has been optimized for Windows 8, HP will offer the option of preloading the EliteBook Revolve with Windows 7 for those not yet prepared to jump on the Metro train.
The EliteBook Revolve is the latest bid by HP to grab a slice of the growing enterprise tablet market. In October it announced the business-oriented ElitePad 900, boasting a rugged form factor and compatibility with Smart Jacket peripherals. The company has also announced a new Multi-Tablet Charging Module that can store and charge up to 10 tablet PCs with screen sizes under 10.1 inches from a single power outlet.
Like the ElitePad 900, the Multi-Tablet Charging Module will be available in the U.S. in January. It will be priced at US$499. Pricing of the EliteBook Revolve will be announced closer to its March, 2013 release date.
Source: HP , Gizmag

IGN's Top 007 Bond Gadgets, Internet searching without typing, Raspberry Pi, and lots more!!

Here is a collection of videos on some neat technologies of 2012!!!
Enjoy watching, and feel free to comment on and discuss them!!!
:D


How it's possible to play high-end games on ultraportable laptops


How it's possible to play high-end games on ultraportable laptops

Gaming on a laptop has traditionally meant using massive desktop-replacement beasts tied to the power socket, with no hope of fun on the road.
On the flip side, trying to play modern titles on a machine with integrated graphics has generally meant staccato frame rates in the single digits.
But what if we told you that it needn't be that way? What if we told you that on an Ultrabook with only HD 4000 graphics we could have Crysis 2 running smoothly, and without too much sacrifice either?
Lucid Logix is a name that will be familiar to most readers as the company that allowed folk with Z68 or Z77 motherboards to use discrete graphics cards and still have access to the funky Quick Sync bits of the Ivy Bridge and Sandy Bridge chips.

We went to see Lucid while we were over in San Francisco for IDF. Usually when we say that we mean we saw a representative, but not this time - we actually saw pretty much the entire company. A good chunk of its small team was in the room with us as Offir Remez, president and MD of Lucid, took us through the demos of the latest goodies.
Functional, but not too sexy, right? Its new Dynamix software, though, can double gaming frame rates on integrated graphics, giving laptops without discrete GPUs serious gaming chops. Lucid Logix is a tiny company with big ambitions, and now it's got the software to match that ambition.
We saw its Virtu MVP Mobile software running on a laptop and a concept external GPU set up via a hot-swappable Thunderbolt connection - but it was the new software running on an Ivy Bridge Ultrabook that really impressed.

Crysis management

Gaming on an Ultrabook explored
The little laptop, with its relatively feeble HD 4000 graphics, had Crysis 2 sitting on it. While it's not quite the crazy-demanding game its predecessor was, it's still a graphics hog, so on the surface it might seem unfair to put the poor machine through the wringer with it.
And with the machine barely managing to hit 9fps it seemed like a pretty pointless exercise - nobody is going to play at those frame rates. That's where Lucid's Dynamix software comes into play, though.
A quick press of a pre-ordained key to enable it while still in the game, and suddenly the FRAPS frame rate counter jumped up to over 20. Suddenly it was playable and much, much smoother. A credible gaming experience on an Ultrabook - what voodoo is this?
It's a software-based solution, requiring no extra hardware and - in a first for Lucid - operating on a single graphics processor.
"We take everything we know how to do," says Offir. "We know every frame going into the pipeline. We capture it before, we analyse the tasks, we know what it's going to do. We sometimes distribute it between the CPU and GPU, and sometimes different GPUs.
"We said, 'Can we use that in a one GPU environment and walk the fine line between quality and performance?'" he continues. "Would you give up a small percentage of quality - we are playing with pixels here - to double performance? Let's say 2 per cent quality to double performance."

Dynamic Resolution Rendering

Gaming on an Ultrabook explored
What Lucid is doing here is based on something Intel itself passed around at this year's Games Developer Conference (GDC) back in March - something called Dynamic Resolution Rendering. It was a concept which allowed better frame rates on lower powered hardware, while still retaining much of the visual clarity you want with high-resolution gaming.
But nobody wanted to know. The extra code needed to add this into the developers' game engines obviously wasn't seen as worth it for individual titles on a platform as seemingly niche as the PC.
Lucid though has taken this away from the games themselves, and is creating an ecosystem that it can add to a machine to enable the resolution switching in any game on the fly.
The essential idea is to dynamically adjust the resolution of the 3D scene so that it can run smoother and faster, while still keeping the GUI/HUD of the game rendered in the native resolution. That way the overlay doesn't expand and end up taking over the screen - as it would if you dropped resolution as a whole - and remains clear and crisp and out of the way of the 3D scene.
As Lucid's demonstration showed, dropping the resolution of the actual 3D scene itself this way doesn't harm the image quality too much, and adds a whole heap onto the performance side. You can also, as Lucid is doing with Dynamix, offset much of the image degradation of dropping resolution by using less GPU-intensive post-processing effects to help smooth things out.
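The general idea is straightforward to sketch. The snippet below is my own simplified illustration of dynamic resolution scaling in general, not Lucid's actual implementation: pick a render scale based on the last frame's cost, render the 3D scene at that reduced size, then composite the HUD at native resolution.

```python
# Simplified dynamic resolution loop (illustration only, not Lucid's code).
NATIVE = (1920, 1080)
TARGET_FRAME_MS = 33.3  # budget for ~30 fps

def choose_scale(last_frame_ms, scale):
    # GPU cost scales roughly with pixel count, so nudge the render scale
    # down when over budget and back up when there's headroom.
    if last_frame_ms > TARGET_FRAME_MS:
        return max(0.5, scale - 0.05)
    if last_frame_ms < TARGET_FRAME_MS * 0.8:
        return min(1.0, scale + 0.05)
    return scale

scale = choose_scale(45.0, 1.0)  # frame was too slow -> scale drops to 0.95
render_res = (int(NATIVE[0] * scale), int(NATIVE[1] * scale))
# The 3D scene is rendered at render_res and upscaled to fill the screen;
# the HUD is drawn separately at full NATIVE resolution so it stays crisp.
print(render_res)
```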
The trade-off then is visual clarity. Because the new technology is enabled on the fly, you can immediately see the loss of fidelity - there's a faint smudging visible around the edges, like you'd see anyway running the game in a non-native resolution.
Gaming on an Ultrabook explored
But when you're switching from unplayable-but-sharp to smooth and a little less clear, it's a pretty easy choice. And Lucid hasn't finished optimising yet and is confident it can sharpen things up more in future iterations.
If you want a completely high-end, high-resolution gaming experience then you're still going to need a discrete GPU. But if you just want to play a 3D title with smooth frame rates on your Ultrabook/integrated graphics processor, you're not going to be that bothered about a little loss of clarity.

At the moment Lucid is only looking at this in the mobile sphere, but we also spoke about whether the same could be applied to small form factor machines - the sort of little PCs you stick under your TV for media functionality.

Desktop Dynamix

From the sofa the slight smudging is going to be barely visible, and with Valve and its big-screen gaming Steam initiative gaining traction, having a wee PC capable of gaming on your TV is actually quite desirable. This could really open up PC gaming to a whole new section of the PC world.
Now Intel is starting to take notice again and so are the laptop manufacturers. Lucid didn't fully realise just how well-received the software would be and is now being tasked with using it in the first round of Haswell laptops due for release in the middle of next year.
And if the 2x GPU performance of the 4th Generation Core Architecture holds true that could mean 40fps in Crysis 2 on an Ultrabook. Now that's tantalising.

Source : techradar

PC tech in 2013: what to expect


PC tech in 2013: what to expect
Has 2012 been a vintage year for the PC? Things have certainly started to hot up with the release of Windows 8 and Microsoft Surface.
It's been epic in the sense of a computing platform going through a major transition. Exactly how successful that transition turns out to be, we'll have to wait and see.
What is absolutely guaranteed, however, is that by the end of 2013 there will be PCs unlike anything seen before. New form factors. New capabilities. New value propositions.

What is a PC? It's changing...


So is an ARM-based PC really a PC? Similarly, is an ultra-mobile device powered by an Intel x86 chip but running Google's Android OS a PC? Or does a PC only mean the classic Wintel alliance of Microsoft Windows and Intel x86 processing?
The answer to that question will become increasingly tricky during 2013. Microsoft has released a version of the Windows operating system that's compatible with ARM processors, for instance.
Acer Netbook
It's x86. But it runs Android. Is it a PC?
As the lines become increasingly blurred, perhaps it's device types that will matter, not the notion of a PC.

Tablet conversion

While that's playing out, devices that can definitely be called PCs in terms of ye olde Wintel thing will increasingly be available in tablet format.
The poster child for full-on PC tablets is obviously Microsoft's Surface Pro, due out early in 2013. On paper, it looks like one device to rule them all.
It's a proper x86 PC with an Intel Core i5 processor. It runs the full version of Windows 8 complete with the powerful desktop interface and compatibility with bazillions of legacy applications.
MS Surface
Is this the touch-feely future of the PC?
But it's also a tablet device with the Windows Modern touch interface. So, unlike Apple's iPad or any number of Android tablets, it's not an as-well-as device. It's instead-of. Instead of a laptop, that is.
So you'll have full touchscreen tablet functionality combined with traditional laptop content creation capability in a single, ultraportable device. Brilliant.
The only problem with Surface Pro is pricing. It'll probably cost £800 or more. Mercifully, the world and his dog will be producing tablet convertibles and touch-enabled laptops in 2013. So prices will tumble over the year.

Intel's Next Unit of Computing

It arrived at the end of 2012. But 2013 will be the first full year for Intel's Next Unit of Computing or NUC.
Superficially, NUC is just an ultra-compact PC little different from, say, a Mac Mini or any of a number of super slim boxes.
But NUC is important in terms of the predictions it makes about the shape of PCs to come. The basics of NUC involve high levels of feature integration, solid state storage and a compact, flexible form factor.
Intel NUC
Intel's NUC: it's nothing like a Mac MINI, umkay?
Most importantly, NUC predicts future PCs powered by SoCs, or system-on-a-chip devices. Gone will be the concept of a motherboard into which you plug a CPU and a graphics card. It'll all be on a single chip.
Currently, the only problem with NUC is pricing. Even a basic model costs over £400 once configured with an SSD, WiFi and memory. And that makes it look very poor value compared to ultrabooks that offer all that, but also a battery, a screen and, soon, touch capability.

Chips with everything

The big news from Intel in 2013 will be Haswell. It's Intel's next big CPU redesign and it pretty much lines up with all the other PC related trends for 2013.
So it won't be a major step forward in terms of raw CPU performance. Instead, it's another step towards that system-on-a-chip end game Intel is aiming for.
Intel Haswell
The Intel machine is gearing up for Haswell
With that in mind, Haswell's graphics take a big step forward. Hard numbers haven't been released, but performance getting on for double Intel's current processor graphics is probably a realistic expectation, in some applications at least. If so, that will kill the bottom end of the graphics card market stone dead.

A year of reckoning for AMD

As for AMD, we were hoping its desperately needed new Steamroller CPU design would rock up in 2013. But that's looking increasingly unlikely.
Instead, what 2013 will likely bring will be closure on the basic question of AMD's survival. By the end of next year, we'll very likely know whether AMD can survive in the long haul.
If AMD does die, 2013 could see the launch of the very last family of Radeon graphics chips. The fact that NVIDIA probably won't bother to launch a new high end graphics chip in 2013 makes that all the more significant. 2013 could be the year the graphics war is finally won by NVIDIA.

Of screens and SSDs

Elsewhere in PC hardware, the familiar tale of incremental but relentless technological advance will continue.
Peak performance for solid state storage probably won't improve dramatically. But random access will, as will price-per-GB for SSDs. Given that the only real problem with the latest SSDs is pricing, that's good news.
It's a similar story for screen technology. 2013 probably won't be a year for revolutionary changes. OLED screens, in other words, are unlikely to go mainstream.
But IPS panels should become ever more commonplace now that consumers have been given the hard sell in tablets and phones.
iPad MINI
Popular IPS panel tech: First tablets, now PC monitors
Indeed, on the subject of tablet and phone tech finding its way into PC screens, there's also a chance that high DPI panels could begin to pop up in PC monitors.
Apple has already stuck a few into its Macbook portables. If it commissions high-DPI panels for a new cinema display, expect that to kick off a broader high-DPI trend for PC monitors.
Here's hoping for 4k 30-inch panels and perhaps 2,560 x 1,440 pixel panels in the 22 to 24-inch segments.

Source : techradar

Sharp unveils 32-inch 4K2K LCD monitor


Sharp has announced the forthcoming Japanese release of a 32-inch 4K2K computer monitor

In common with many of today's digital content junkies, I get my daily entertainment fix from a computer screen and not a TV. Even if I could afford to buy into the jaw-dropping Ultra HD image quality I witnessed from the giant goggle boxes being showcased by Toshiba, Sony and LG at IFA 2012 in Berlin a few months back, they'd likely spend much of their time powered off. As such, the upcoming release of a 32-inch 4K2K computer monitor from Sharp would be of great interest, were it not being aimed specifically at the business community in Japan.
Sharp's PN-K321 32-inch LCD monitor's crisp and clear 3840 x 2160 pixel resolution (four times that of full HD) at 140 ppi pixel density should attract a smattering of appreciative applause from video/graphics professionals, CAD users and others looking to squeeze a large amount of information onto one screen without loss of detail. It features the company's proprietary IGZO technology, which provides LED backlighting from the monitor's edges and in so doing allows the company to reduce the depth of the unit to just 35 mm.
Sharp's PN-K321 32-inch LCD monitor has a resolution of 3840 x 2160 pixels, at a density o...
The 4K2K monitor also benefits from a wide 176-degree viewing angle, a response time of eight milliseconds, 250-nit brightness and an 800:1 contrast ratio. The PN-K321 is compatible with the latest DisplayPort and HDMI specifications, allowing for single-cable PC connection. It also sports two integrated 2W speakers and 3.5 mm audio in/out jacks.
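As a sanity check on that ~140 ppi figure, pixel density is just the diagonal pixel count divided by the diagonal size in inches:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    # pixels along the panel diagonal divided by the diagonal length in inches
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(3840, 2160, 32)))  # 138, in line with the quoted ~140 ppi
```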
Sharp says that the new model will be introduced in Japan as of February 15. Unfortunately, at the time of writing, there's no official word on if or when the unit will be released elsewhere.
Source: Sharp , Gizmag

Steam says living room PCs will take on games consoles next year


Steam says living room PCs will take on games consoles next year

Steam's quest to take over your living room game experience is gaining momentum. After launching "Big Picture" mode last month to help games play better on a big screen, the company has its sights set on PCs specially designed for connecting to your television.
Talking to Kotaku at the VGA awards, Valve boss Gabe Newell claimed that he expects lots of companies to begin selling computer packages specifically designed for the living room from next year.
These PC packages will come with Steam ready to run, and take advantage of the Big Picture mode to allow for an immersive PC gaming experience in the lounge room.

Entering the hardware game

According to Newell, Valve will also sell its own hardware to connect Steam to your flat panel television.
The Valve hardware, however, will be a lot more locked down than a traditional PC.
Of course, PC gaming fans who want to tinker with the internal components will be catered for by third party manufacturers, according to Newell.
The launch of Steam PCs for the living room will put even more pressure on the next generation of gaming consoles, opening up the market in a big way.
Via: Kotaku , Techradar

Anatomy of a hard drive: what really goes on inside your PC's storage


Anatomy of a hard drive: what really goes on inside your PC's storage

Here's a thought: what's the most valuable component inside your PC? Valuable to you that is - not in terms of resale value.
PC kit doesn't really make for good family heirlooms, unless your grandfather bought an AdLib sound card in his final days and it was passed on to you.
But for most of us, the most valuable component is definitely the hard drive.
If a CPU blows up or a graphics card buys the farm, we can simply buy new ones. But if a hard drive says "goodbye cruel world", taking all of your vital files with it (and you don't have recent backups), well, no amount of money can fix that.
And yet, despite its importance, the humble hard drive doesn't get much attention.
We all have a tendency to focus on flashy things such as new distros and desktop environments, but there's a wealth of useful information to discover and learn about these devices.
For instance, there are many different strategies for splitting up the disk into different chunks (partitions), affecting security and performance. There are different types of filesystem you can use, and tricks you can employ to recover data if something goes wrong.

Anatomy of a Hard Drive

New technologies, such as SSDs, are changing the role of hard drives. If you've accidentally deleted a file, there's still a chance that you can recover it thanks to some cunning tools.
So far from being a boring box of bytes stuffed into a random space in your PC, the hard drive is actually a world of technology, with many options for customisation.
Our aim in this feature is to teach you everything that's worth knowing about hard drives - and a little bit more as well. We've also included a few bits you can cut out and stick on the wall next to your PC, in case you have an emergency.
Just to be on the safe side (for us and you!), a quick disclaimer: this guide covers making modifications to the structure of hard drive data. We absolutely recommend trying out commands and options for yourself, as it's the best way to learn, but only on a test machine (or in VirtualBox).
Don't experiment on your main PC, unless you want to risk losing data!

What are partitions?

A blank hard drive isn't much use to anybody; it needs some structure before it can start storing files. From a low-level perspective, drives are made up of sectors, which are very small units of data storage at fixed locations on the disk.
There can be many millions of sectors in a drive, and they are organised into meaningful groups at multiple levels. First off, at the foundation level, we have partitions (we'll look at filesystems later).
Essentially, a partition is a collection of sectors assigned to a specific storage task. Most brand new PCs ship with only Windows (sadly), so in their hard drives there is just a single, large partition that occupies almost the entire disk.
This appears as the C: drive when Windows boots up. Some machines have a second 'rescue' partition, containing a backup of the OS for when it needs to be reinstalled.
The purpose of partitions is to keep data areas separate from one another.
When you install Linux on a Windows PC, for instance, the Linux installer typically shrinks down the Windows partition to make room for Linux ones. At the end, you have a drive with multiple partitions, as in the diagram.
Windows knows it shouldn't mess around with unrelated Linux partitions, and vice-versa. The sizes of these partitions vary from system to system, depending on how much you allocate to each OS.
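You can see this sector-and-partition structure on any live Linux system. Here's a minimal sketch using /proc/partitions (always present on Linux) and util-linux's lsblk; the device names you'll see depend on your own hardware:

```shell
# Every block device and partition the kernel currently knows about,
# with sizes in 1 KiB blocks (sda is a whole disk; sda1, sda2 are
# partitions carved out of it).
cat /proc/partitions

# The same information as a friendlier tree, courtesy of lsblk.
lsblk -o NAME,SIZE,TYPE,MOUNTPOINT
```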

Cut out and keep: emergency partitioning

The fdisk program is much like the Vi text editor, but for partitioning: it's terse, minimal and available in virtually every distro. Start it (as root) by providing a drive path (to the device node) like this:
fdisk /dev/sda
In a typical Linux installation, /dev/sda refers to the first hard drive, while /dev/sdb refers to the second, and so forth. Enter p and you'll see a list of partitions on the drive, as in the screenshot.
Note here the Start and End columns, which show sectors in use. Each partition has a number, so sda1 is the first partition on the first drive, and sdb3 is the third partition on the second drive.
To delete a partition, enter d and you'll be prompted for the number.
To add a new partition, enter n. You'll be asked whether to make it primary (maximum four per disk) or extended; go for the former if you have room, for simplicity's sake.
Then enter a start sector number (taking into account the list earlier) and a size. Back at the main prompt, enter p and you'll see the new partition in the list.
It has no type ID at the moment, though, so enter t, then the partition number, and then L to list the available types. Enter 83 for a Linux partition, 82 for a swap partition, or 7 for a Windows (NTFS) partition. Finally, enter w to write the changes to disk, or q to quit without writing anything.

Linux and Windows partitions on a hard drive

An fdisk session, showing the Linux and Windows partitions on a hard drive.

A separate /home: yes or no?

One of the biggest choices you face when installing Linux and partitioning a hard drive is this: do you put the /home directory on a separate partition?
This is where user files live - that is, personal documents and settings for user accounts, as opposed to operating system files, which live in separate directories.
Some Linux distributions recommend using a separate partition, whereas others default to dropping everything into the same partition. So, what do you do? The answer depends on how you want to use your machine.
If you plan to try many different distros, and you're often installing new ones over the top of old ones, then it makes sense to have a separate /home partition.
In this way, you can do what you like with the operating system - upgrade it, downgrade it, or wipe it all and try some random new distro from the Faroe Islands.
Whichever Linux flavour you happen to be running, your personal files will always be there, stored safely on a separate part of the disk.
If you're careful, you can even have multiple Linux distributions on the same machine, all using the same partition for the /home directory after booting.
But why do we say you have to be careful? Well, think about settings and configuration files. If you do an ls -a in your home directory, for instance, you'll see a large number of hidden files and directories starting with full-stops - these store settings for programs. If you try to use the same settings between different versions of a program, it can really confuse that program.
For instance, let's say you have Distro A and Distro B on your machine. You boot Distro A and run FooProgram 2.0 for the first time, which creates a .fooprogram/ settings folder in your home directory.
Then you boot Distro B with the same home directory, and start FooProgram - but in this case, it's version 1.0. It'll get confused by differences in the configuration files, and could crash or corrupt data. Another potential problem with separate /home partitions is the size constraint.
If you put everything on one partition, then the OS and home directories both have access to free space.
If you put /home on a separate partition and run out of room, you can't easily take space from the OS partition (although if you use LVM, the Logical Volume Manager offered during installation by many distros, you can overcome this, as it supports resizing volumes).
There are plus points to the separate-partition approach, though, especially now that SSD (solid state) drives are becoming more affordable and popular.
Because they're screamingly fast in comparison to spinning hard drives, you could put the OS files on an SSD for fast system and app startup times, and then /home on a traditional hard drive (after all, you're not too bothered how long it takes your LibreOffice documents and photos to load).
For general spinning hard-drive installations for home desktops, where you're not going to be trying a new distro every other day, though, we recommend the 'putting everything in one partition' approach.
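Not sure which layout your current system uses? df will tell you: if /home sits on its own partition, it reports a different source device for /home than for /. (This sketch assumes a /home directory exists, as it does on virtually every Linux install.)

```shell
# Compare the device backing / with the one backing /home.
# Two different devices means /home is on a separate partition;
# the same device means everything lives in one partition.
df --output=source,target / /home
```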

The most important directories

Have a look in the root (/) directory, and you'll see lots of directories that may be unfamiliar to you.

You may see a lot of unfamiliar directories

The root (/) directory might look like a jumbled mess of random words to Unix newcomers, but it actually makes sense. Everything has its own place.
While most users rarely need to venture into these directories, it's worth knowing what they do:
/bin Binary files - or, more specifically, executables used by the base system. This doesn't include larger desktop applications, such as Firefox (they are kept in /usr).
/boot Files used for booting, such as the Linux kernel.
/dev Device nodes - files that can be used to access hardware devices.
/etc Configuration files for the system (per-user settings are stored in the /home directories).
/media Removable media, such as USB keys, are often mounted here.
/mnt Another place for mounting drives (rather confusingly), but usually hard drives or network shares.
/opt Optional application packages. In some distros, huge beasts of software, such as KDE or LibreOffice, live here.
/proc Process information. Only really useful for admins wanting to monitor a program's resource consumption.
/sbin Critical executables for running the system, but which should only be executed by the superuser (root).
/usr Non-critical files, such as applications. Inside is /usr/lib, which contains most of the libraries used by apps.
/var Variable files - ie, data that changes a lot, such as databases, mail spools and system logs.

Cut out and keep: sync your disk

Here's something that might surprise you: when you save a file in a program, it doesn't actually get written to the disk straightaway. At least for small files (eg, less than a megabyte), anyway.
For performance reasons, operating systems don't write data to the hard drive at every request, but wait until there's a lot of data from multiple write requests.
So the OS stores all of these write operations in a RAM buffer and then commits them to disk in one fell swoop. If you've ever been unlucky enough to suffer a power cut a few seconds after hitting Ctrl+S in a program, you'll have seen this in action.
Fortunately, there's a solution. At any time, you can enter sync in a terminal window to guarantee that all write operations are written to the physical disk.
And there's a special key sequence you can use if the X Window System freezes - ie, the graphical layer has totally locked up - but you want to sync everything to the drives and reboot safely. It's called the Magic SysRq key, it's enabled in most distros, and it works as follows:
Hold down Alt+SysRq (usually top-right on the keyboard) and then press the following keys in order:
R (get keyboard control back), E (ask processes to terminate), I (kill any processes that didn't), S (sync data to disks), U (remount drives read-only), and B (reboot).
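Two of these steps are easy to try safely from a terminal. sync itself is harmless to run at any time, and you can check whether your distro has the Magic SysRq key enabled by reading a file under /proc:

```shell
# Flush every pending write from the RAM buffer out to the
# physical disks. Returns only once the data has been handed
# over to the drives.
sync

# The kernel's SysRq setting: 0 = disabled, 1 = fully enabled,
# other values are a bitmask of allowed functions (varies by distro).
cat /proc/sys/kernel/sysrq
```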

What's in a filesystem?

A hard drive without a filesystem is just a jumble of data. The filesystem helps the operating system to make sense of the disk - finding out where files start, where they end, and which directories they belong to.
In a simple filesystem, such as DOS FAT, you have a table in the first few sectors, describing where the files are located. Every file has an entry in this table (which is why most filesystems have a limit on the number of files), containing its name, the time it was created, how big it is in bytes, which sector it starts at, and so forth.
By far the most common filesystem in the Linux world is ext4, which is an excellent, reliable, general purpose filesystem for hard drives.
For a while, there was some competition in the Linux world in the form of ReiserFS, an innovative filesystem whose development suffered a setback when the lead developer was charged with and later convicted of the murder of his wife…
There are other filesystems worth being aware of, typically suited to more specialised tasks than a regular desktop PC.
ZFS, for instance, provides great performance and reliability across multiple disks, as covered in our FreeNAS tutorial.
Then there's LogFS, designed to be used on flash drives (which internally work completely differently to spinning hard drives, and therefore can benefit from a dedicated filesystem).
It's interesting to note that with all the advanced filesystems in use today, on both Linux and Windows, typical USB flash keys come pre-formatted with FAT32 (and its limitations). It feels a little bit strange to use such backward technology today, but it does mean that these keys are compatible with virtually everything.
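To check which filesystems your own machine is using right now, and which ones the running kernel can mount, you only need two standard commands (this sketch assumes a Linux system with coreutils installed):

```shell
# Show the filesystem type (ext4, btrfs, xfs...) of every
# mounted partition, alongside its size and mount point.
df -T

# Filesystems the running kernel supports; the 'nodev' entries
# are virtual ones (proc, tmpfs) with no disk behind them.
cat /proc/filesystems
</```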

Cut out and keep: recover lost files

Making backups is the single most important thing you can do for your data. And the second thing is making even more backups.
But despite good intentions, we can all make mistakes and accidentally delete files. Due to the way modern filesystems work - shuffling files around on the drive to avoid fragmentation issues - file recovery software isn't always 100% successful. But there's hope.
First, get hold of a rescue distro with Photorec installed. One of the best is Recovery Is Possible, aka RIP, a mini distro designed for fixing damaged Linux installations.
Burn it to a CD-R and keep it handy near your PC for emergencies.
When you want to recover an accidentally deleted file, shut down the machine and boot RIP. In a terminal, enter 'photorec' and you'll be prompted to select the drive and partition that contained the file.
Then you'll be asked for the filesystem type, and whether to scan the whole drive or just space marked as empty (the latter is quicker). Finally, you'll be asked for a location to store recovered files.
Afterwards, recoverable files will be placed in recup_dir folders, followed by a number. You won't have the original filenames, so an image such as kitten.jpg could become f0015362.jpg.
If you have lots of files, you'll have to look at them and rename them manually. But at least you have the data back...
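Renaming thousands of f0015362.jpg-style files by hand is tedious, so it's worth at least sorting them by extension first. A minimal sketch, assuming PhotoRec's output landed in a folder called recup_dir.1 (the two touch lines just stand in for real recovered files):

```shell
# Stand-in for a PhotoRec output folder with two recovered files.
mkdir -p recup_dir.1
touch recup_dir.1/f0015362.jpg recup_dir.1/f0015363.txt

# Move every recovered file into a per-extension folder
# (sorted/jpg, sorted/txt, ...) so you can browse images and
# documents separately before renaming them.
for f in recup_dir.1/*.*; do
    ext=${f##*.}              # extension after the last dot
    mkdir -p "sorted/$ext"
    mv "$f" "sorted/$ext/"
done

ls sorted
```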

Source: Techradar

Is Apple bringing Siri and Maps to OS X?


Will iMac owners be asking Siri the meaning of life?

If you ask someone about Apple's biggest success, you won't likely hear the words "Siri" or "Maps." Siri is inconsistent and still in beta, while iOS Maps was panned for taking a huge step back from Google Maps. A recent report, however, reminds us how important both are to Apple's future: the two services are rumored to be included in the next version of Mac OS X.
According to 9to5Mac, early test builds of OS X 10.9 include the virtual assistant and the navigation service. Apple supposedly began working on OS X 10.8 and 10.9 at the same time. Mountain Lion was heavy on iOS features, such as Reminders, AirPlay, and Notification Center; 10.9 will supposedly pick up its leftovers, highlighted by these two.

Details

As pretty as Flyover is, Apple would be wise to keep quiet about Maps until it's improved
It's possible Maps would also be a stand-alone app, but the report only mentions it as a framework for developers. If it makes it into the final version of 10.9, devs will have the option of embedding maps into their Mac App Store apps. Whether they'll want to, after the Maps backlash, is another question.
As for Siri, it would supposedly look much like it does on the iPad. Expect a small window at the bottom of the screen. The voice assistant would be less useful on a PC than on a smartphone or tablet, but it could come in handy for setting reminders or checking the weather. Voice dictation is already available on Macs, as it shipped with Mountain Lion.

Playing the long game

Despite early criticism, Tim Cook is playing the long game against Google (digitally altered from original Shutterstock image)
So why bake two heavily-criticized features into OS X? Both are key to Apple's future: they're ammo in the war against Google. Maps has been a PR disaster, but if Apple wants to separate itself from all Google services, creating it was a necessary step.
Siri represents another front against Google: search. It's primitive now, but Siri could eventually be iPhone owners' primary way of retrieving information.
If Apple repeats last year's schedule, OS X 10.9 will enter public beta in February, and release in late July. Little else is known about the update, but we'll likely hear more within the next few months.
Source: 9to5Mac
