jQuery Upgrade Causing Trouble for Twitter?

Twitter engineer Dustin Diaz tweeted this evening: "Turns out #NewTwitter has some major slowness due to upgrading from jQuery 1.4.2 to 1.4.4 – We will be downgrading." According to Diaz, users have been complaining for the past couple of weeks that the service has been slow.

Diaz provided more details via Twitter for those curious, and his comments might prove useful to others working with jQuery. The new version of jQuery was released in November. Is anyone else having trouble with it?

Klint Finley
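One practical takeaway from Twitter's experience is to check at startup which framework version actually loaded, so a silent upgrade can't change performance characteristics unnoticed. A minimal sketch, assuming only that jQuery exposes its version string as `jQuery.fn.jquery` (it does); the `EXPECTED` constant and `checkVersion` helper are illustrative, not part of any real API:

```javascript
// The version this page was tested against (illustrative value).
const EXPECTED = "1.4.2";

function checkVersion(actual, expected) {
  // Return a warning string if the running version differs from the
  // one the page was tested against, else null.
  if (actual !== expected) {
    return "jQuery " + actual + " loaded; tested against " + expected;
  }
  return null;
}

// In the browser you would call: checkVersion(jQuery.fn.jquery, EXPECTED)
console.log(checkVersion("1.4.4", EXPECTED));
// → "jQuery 1.4.4 loaded; tested against 1.4.2"
```

Logging (or reporting) the mismatch rather than hard-failing keeps the page usable while still surfacing the regression candidate in diagnostics.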

2013: Say Goodbye To The Traditional Data Center

Brian Proffitt

In 2013, traditional data centers will begin to lose their dominant status within the data-management food chain. They will increasingly be replaced by big-data software and lower-cost, ARM-based systems-on-chips.

When thinking about the future of data centers, the problem is one of scale. For the past few decades, relational databases and the attendant hardware that runs them have been able to manage pretty much anything a company could throw at them, but those days are coming to an end.

When Relational Ruled The Land

In the beginning, and for the first 20 years or so, data was heavily transactional, and was managed in discrete and very secure ways. Speed was less important than making sure the data was safe as houses.

In the 1990s, data began to be used in a slightly different way, as companies placed analytical demands on the data being gathered. Instead of being retrieved in discrete packages, data became a strategic asset to be analyzed, leading to the discipline of business intelligence. Databases grew into massive data warehouses, and parallel querying arose as the only way to effectively manage the staggering workloads placed on information technology.

Through the early years of electronic data, growth in the volume of data may have been rapid, but data tools and infrastructure were pretty much able to keep pace.

That's not so true anymore. Software soon will not be able to cope with the overwhelming volume of data being generated, says Mike Hoskins, chief technology officer of Pervasive Software. What's coming is a real break in how data is managed.

Breaking The Old Model

To give an idea of what kind of scale we're talking about, Hoskins points to U.S.
retailer Wal-Mart's estimated 1-petabyte data store.

"That's the accumulation of 40 years of Wal-Mart-sized business," he said. "Facebook? Facebook generates that much data in a week."

There's always a collection of data behind each transaction. But in e-commerce today, a customer can be clicking around quite a bit before buying, which leads to useful data sets tens, hundreds or thousands of times larger than "so-and-so bought widget X with credit card Y." Add the fact that the machines handling these activities are also recording machine-to-machine transactions, and the data workload explodes beyond the capacity of any traditional data center.

"We are reaching the end of the useful life" of our data centers, Hoskins said. "The bottom line is, it's a death march."

Even if conventional software could manage this explosion, no company could afford it. Not to mention the energy costs involved in buying, running and cooling the hardware.

Indeed, it is innovation in hardware that's going to provide the evolutionary break that Big Data requires. Servers with ARM-based processors, which draw something like 20 times less power than Intel-based processors, are the next wave in data center infrastructure. Lower power requirements, after all, mean less resistance and less heat. Less heat means less money wasted on cooling, and the ability to pack ARM-based systems closer together.

As energy and general hardware costs come down, hardware is lined up to handle the workloads of this new, massive scale of data.

First Hardware – Then Software

On the software side, Big Data will increasingly be handled by Hadoop systems that can store data and manage and analyze Facebook-scale loads.

If you're wondering why this is supposed to be big news, think about it this way: relational databases have been handling data of all shapes and sizes for decades, and now there will be a certain level of data that the traditional data center architecture will simply be unable to handle.
It's the first stratification of data management. On one level, relational databases will still be around, supporting smaller, less complex and more tactical workloads. But on this new level, whole new architectures will be created to deal with this scale.

Big Data in the form of Hadoop-based architectures is but the first step into the future. In the past, data managers had to heavily pre-process data to get it to fit within a certain schema for use in a relational database. Today, they're forgoing the pre-processing and shoving the unformatted data into commodity Hadoop clusters. To perform analytical work, data managers pull refined data back into databases and other analytical tools.

What's The Data Center Endgame?

This halfway approach is not the endgame, though.

Eventually, Hoskins believes, tools will be built into the Hadoop framework that will enable data managers to run applications and analysis right where the data lives, inside the Hadoop clusters.

It's no accident, then, that the latest iteration of one of Hadoop's core components – MapReduce 2.0, code-named YARN – includes the beginnings of a framework that will let developers build exactly those kinds of tools inside Hadoop. Apache Hadoop VP Arun Murthy confirmed this to me early this year at the Strata Conference in Santa Clara, California. When the YARN application framework is robust enough, Hadoop will let developers code those applications.

This will be the new way of working with data as it gets too big for relational databases to handle: a new architecture of low-cost, low-power servers that keeps applications and data as close to each other as possible, to maximize efficiency and speed.

"Relational database technology has had a good run," Hoskins said.
But the days of the relational database being a part of every data solution are fading fast, as a new kind of data center becomes the new sheriff in town.
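The shift the article describes — skipping the pre-processing and imposing structure only when data is read back for analysis — is often called "schema-on-read." A toy sketch of the idea, in which raw records are stored as-is (as a Hadoop cluster would hold them) and a schema is applied at query time; the `rawStore` data and `refine` helper are illustrative, not a real Hadoop API:

```javascript
// Raw, unformatted records land with no up-front schema.
const rawStore = [
  '{"user": "a", "action": "click", "item": "widget-x"}',
  '{"user": "b", "action": "view"}',   // missing fields are fine on write
  'not even valid JSON',               // malformed records land too
];

function refine(rawRecords) {
  // Impose structure at read time, skipping records that don't parse.
  const refined = [];
  for (const line of rawRecords) {
    let event;
    try {
      event = JSON.parse(line);
    } catch (e) {
      continue; // schema-on-read: junk is filtered at query time, not load time
    }
    refined.push({
      user: event.user || "unknown",
      action: event.action || "unknown",
    });
  }
  return refined;
}

// "Pull refined data back" for analytical tools.
console.log(refine(rawStore));
```

Contrast with the relational (schema-on-write) approach, where the malformed third record would have been rejected or cleaned before it could be stored at all.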

Rep. LaFave notes state's progress, but work to be done for U.P. families in 2018

Jan. 25, 2018

State Rep. Beau LaFave agreed with many of the remarks made by Gov. Rick Snyder during his 2018 State of the State speech Tuesday at the state Capitol, while also acknowledging much needs to be done for residents in the south-central Upper Peninsula.

Regarding the state's progress since Snyder became governor in 2010, LaFave agreed Michigan is a transformed and much stronger state than it was eight years ago.

"Michigan has made it a long way since the 'Lost Decade' of the Granholm administration," said LaFave, of Iron Mountain. "The strides we have made have been incredible. We've created over 500,000 private-sector jobs. We have been able to lower taxes, while also growing our rainy day fund from basically nothing to nearly a billion dollars. That gives us a much stronger foundation to help build our state for an even brighter and prosperous future."

Snyder's speech noted local programs across the state, with one 'shout out' – LaFave's favorite moment of the night – going to Delta County.

"We all know about the great accomplishments that have come from the Angel Program," LaFave said. "Delta County and prosecutor Phil Strom really deserved our governor's 'shout out' for this initiative to help those seeking to overcome drug addiction in our community. It's doing tremendous work and obviously is getting notice statewide."

LaFave also hopes Snyder will support House legislation to foster workforce training, in particular better cooperation between educators and local businesses.

"That is the opportunity we need in Dickinson, Menominee and Delta counties, getting our businesses working closer with our educators to help fill local jobs and keep our future growing here," LaFave said.
"That is why I submitted legislation, such as House Bill 4106 to grant academic credit to high school students for internships, because that will strengthen and build our communities."

As for the rest of 2018, LaFave is already working on many key issues.

"We need to continue what we were sent to Lansing to do," LaFave said. "We must continue to budget responsibly, improve our focus on workforce training, and decrease taxes and regulations. I want us to improve our dual enrollment programs with community colleges, expand broadband evenly across the state, and find a way to move Michigan forward in a civil and responsible manner. That's all within our grasp."

LaFave was disappointed by one omission during the speech, and it's a top priority for him: car insurance reform focused on reducing rates and giving residents coverage options.

"It was surprising that a key barrier preventing us from drawing more people and businesses to Michigan – having the highest rates in the country – wasn't even mentioned," LaFave said. "This is an incredibly important issue that gouges into the wallets of many U.P. families. I will focus on this reform every day in 2018 until we get the job done."

#####