Google never, ever does something
for no reason. Sometimes it’s just a matter of waiting patiently to figure out
what that reason is.
In May, Google introduced a fetch and render tool in Google Webmaster Tools that shows how Googlebot renders your web pages. At the time, it was unclear why the company was introducing the tool, though it hinted at future plans that would involve fetch and render.
On Oct. 27, we got a definitive answer.
That fetch and render tool foreshadowed the introduction of new guidelines that say your search rankings and indexing could be negatively impacted when you block your CSS or JavaScript files from being crawled. When you allow Googlebot to access these files, as well as your image files, it will read your pages correctly. When you don’t, you can hurt the way the algorithms render your content, and your page rankings can decline as a result.
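The blocking in question usually happens in robots.txt. As a minimal sketch (the directory paths here are illustrative – yours will differ), compare a configuration that locks Googlebot out of render-critical files with one that lets it in:

    # Rules like these keep Googlebot away from the files it needs to render your pages:
    User-agent: *
    Disallow: /css/
    Disallow: /js/

    # Removing those Disallow lines (or explicitly allowing the folders)
    # lets Googlebot fetch the same resources a browser would:
    User-agent: *
    Allow: /css/
    Allow: /js/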
So that tool that was put out a few months earlier was basically a warmup – it can be used to make sure Googlebot is rendering your web pages correctly.
It’s all part of a drive toward better user experience that is ultimately
behind the changes Google has made.
The Nitty-Gritty of the Changes
Google says the change was basically to make its indexing system behave more like a modern browser, which has CSS and JavaScript turned on. So, as always, Google’s claim is that it’s doing this for the greater good. It wants to make sure it’s reading things just like the people who will be looking for your content.
That’s a big change from before, when Google’s indexing systems were more like text-only browsers – Google cites the example of Lynx. But the search engine says that approach no longer makes sense now that its indexing is based on page rendering, the way modern browsers display pages.
The search engine offers a few suggestions for optimal indexing, including:
• Eliminating unnecessary downloads
• Merging your CSS and JavaScript files
• Following progressive enhancement guidelines in your web design
What This Means
With any Google change, the real question is: what does this mean? How will it affect webmasters, and what sort of impact could it have on SEO?
Clearly the answer to that second question is that sites that do not adhere to the suggested guidelines will see their search rankings suffer. Make sure your webmaster fully understands what Google is asking for, and discuss what types of changes should be implemented and how they could affect Google rankings.
Your aim is to create crawlable content, and that means doing whatever Google
suggests. Use the fetch and render tool to make sure everything on your site is
in order. It will crawl and display your site just as it would come up in your
target audience’s browsers.
The tool will gather all your resources: CSS files, JavaScript files, images. Then it runs the code to render your page’s layout as an image. Once that comes up, you can do some detective work: is Googlebot seeing the page the same way it is rendered in your browser?
If yes, you are in good shape. If no, you need to figure out what tweaks to
make so that Google is seeing the same thing you are. Here are potential
problems that could be making your site’s content non-crawlable:
* Your website is blocking JavaScript or CSS
* Your server can’t handle the number of crawl requests you receive
* Your JavaScript is removing content from your pages
* Your JavaScript is too complex and is stopping the pages from rendering correctly
Why These Changes, Why Now
Google always has intent behind what it does, and here’s my read on its intent with these changes: it’s making user experience a bigger factor in its search rankings. Think about it: the emphasis on page loading and rendering marks two major steps in that direction.
That has also prompted speculation that the company could start using mobile
user experience for its rankings as well. There has been rampant speculation in
recent months, as mobile usage begins to overtake desktop, that Google will
begin shifting its focus to the mobile web for search engine optimization.
So could this be one of the first steps on the way to those big changes?
Perhaps. I always think it’s dangerous to try to get too many steps ahead of
Google; the search engine likes to reverse course and throw people off from
time to time. It does not like it when SEOs make changes in anticipation of its
actions, preferring to dictate the course itself. And I do think the idea
behind the crawlable-non-crawlable content changes makes sense. You have to
keep up with the times.
But others could argue that keeping up with the times is exactly what Google
will be doing by putting greater emphasis on mobile user experience.
Just when consumers were starting to regain some company trust and safe-shopping stability after last year’s massive Target breach, a string of new large-scale company breaches quickly reminded us just how insecure our personal data can be.
Needless to say, it’s been a rough year for some major companies and an even rougher year for thousands of unlucky customers. Let’s look at three of the major breaches of the last couple of months.
Home Depot
Early last month, reports started coming in that the home improvement giant was investigating “some unusual activity with regards to its customer data.” Security reporter Brian Krebs immediately called it a credit card breach, especially since multiple banks said they were seeing evidence that Home Depot was the likely source of a batch of stolen credit and debit cards that went on sale on the cybercrime black market that morning.
Sure enough, six days later, the company admitted that its payment systems had in fact been breached and that the hack had been going on for months. It went on to say that while credit card data was exposed, personal PINs were not. Reassurance (not really). And while the exact number of affected cards wasn’t known at that time, one thing was for certain: if you used a credit card at one of Home Depot’s U.S. or Canadian stores in the past 4-5 months, you needed to consider your credit card stolen and get on the phone with your bank ASAP.
About two weeks later (September 18th), Home Depot announced the number. A whopping 56 million cards were impacted, making the incident the biggest retail card breach…ever (on record, at least). The ‘silver lining’? Home Depot also said that the malware was now contained.
Japan Airlines
Before the month of September passed (and with Home Depot still fresh on everyone’s minds), another large company from a completely different industry had some bad news to share with its customers…
On September 30th, Japan Airlines (JAL) confirmed that as many as 750,000 JAL Mileage Bank (JMB) frequent flyer club members’ personal info was at risk thanks to a breach. Apparently, hackers were able to get into JAL’s ‘Customer Information Management System’ by installing malware onto computers that had access to the system. The data that was accessed? Everything from names to home addresses to birth dates to JMB member numbers and enrollment dates. The good news is that credit card numbers and passwords did not appear to be exposed.
There have not been any new developments about this breach since JAL’s official statement on September 29th.
JP Morgan
October 2014 was only two days young when yet another major company confirmed a data breach. This time, the victim was JP Morgan. Or rather, JP Morgan customers who used Chase.com and J.P. Morgan Online websites, as well as the Chase and JP Morgan mobile apps.
Last Thursday, the nation’s largest bank revealed that a mid-August cyberattack exposed personal info for 76 million households, as well as 7 million small businesses. More specifically, names, email addresses, phone numbers and addresses were stolen, while JP Morgan went on to say that there was no evidence that account numbers, passwords, Social Security numbers or birth dates were exposed. While the bank found out about the breach of its servers in August, it has since been determined that the breach began as early as June.
Unfortunately, not much else is certain at this time. What we do know is that Russian hackers are suspected (still not confirmed), over 90 of JP Morgan’s servers were affected, and it is believed that nine other financial institutions were also targeted (although we don’t know their identities). The lack of concrete information is scary in its own right, but the fact that JP Morgan is staying mum on the matter is even more troubling. According to a Huffington Post report from earlier today, the bank is refusing to say how many people were actually hit by the breach, with spokeswoman Trish Wexler saying that JP Morgan isn’t offering more details beyond what was announced last Thursday. This could mean that the breach, already the largest (against a bank) in history, could be even larger than the reported 76 million households and 7 million small businesses, keeping in mind that ‘households’ is not the same thing as ‘individuals’.
Additionally, Fox Business is reporting that the bank is now bracing for a massive spear-phishing campaign in the wake of the breach, according to its sources. Considering that no bank info was compromised in the original breach (JP Morgan said in a statement that it hasn’t “seen unusual fraud activity related to the incident”), this is a plausible next step. Using the personal info obtained in the ‘first wave’, the attackers can send out legitimate-looking emails to the affected customers saying there is a problem with the user’s account and asking for Social Security numbers, passwords, etc. Alternatively, the emails could ask the customer to click an embedded link to update their account info, but in reality the customer is taken to an official-looking fake site from which the attackers can nab the important financial information. In either case, the virtual trap is sprung at that point.
What to do?
It’s no secret that data breaches are on a steep rise. According to the Identity Theft Resource Center, there have been 579 data breaches this year, 27.5% more than there were at this time last year. And that number is only going to continue to increase.
Whichever of these three breaches touched you, it’s important to take basic security steps to ensure your information is safe, whether that means calling your bank and getting a new credit card issued (in the case of Home Depot), changing your password if you’re a JAL frequent flyer and JMB club member, or changing your log-in information and monitoring your online accounts if you bank with JP Morgan or Chase.
As more and more people choose to bank online (and become more internet-dependent in general), it’s also no secret that employing powerful and always up-to-date internet security on your devices is more crucial than ever before. Company breaches and spear-phishing attacks aren’t going anywhere. Take the necessary steps to keep your personal info protected!
There are two major types of marketing strategies. The first is known as mass marketing or “branding”.
The goal of this type of advertising is to remind customers and prospects about your brand as well as the products and services you offer.
The idea is that the more times you run ads from your brand, the more likely people are to have this brand at the top of their consciousness when they go to make a purchasing decision.
If you’ve seen the ads from major brands such as Coca Cola, Nike and Apple you’ll have experienced “image” marketing.
The vast majority of advertising falls into this category.
There’s no doubt that this type of marketing is effective, however it is very expensive to successfully pull off and takes a lot of time.
It requires you to saturate various types of advertising media – TV, print, radio, the Internet and so on – on a very regular basis and over an extended period of time.
The expense and time involved are not a problem for the major brands, as they have massive advertising budgets and plan their product lines years in advance.
However, a problem arises when small businesses try to imitate the big brands at this type of marketing.
The few times they run their ads is like a drop in the ocean. It’s nowhere near enough to reach the consciousness of their target market who are bombarded with thousands of marketing messages each day.
So they get drowned out and see little or no return for their investment.
Another advertising victim bites the dust.
It’s not that small businesses aren’t good at “branding” or mass media ads. It’s that they simply don’t have the budget to run their ads in sufficient volume to make them effective.
Unless you have millions of dollars in your marketing budget, you have a very high probability of failure with this type of marketing.
The second type of marketing strategy is called “direct response”.
Direct response marketing is designed to evoke an immediate response and compel prospects to take some specific action, such as opting in to your email list, picking up the phone and calling for more information, placing an order or clicking through to a web page.
So what makes a direct response ad? Here are some of the main characteristics:
It’s trackable. That is, when someone responds, you know which ad and which media was responsible for generating the response (see the tagged-link example after this list). This is in direct contrast to mass media or “brand” marketing – no one will ever know which ad compelled you to buy that can of Coke; heck, you may not even know yourself.
It’s measurable. Since you know which ads are being responded to and how many sales you’ve received from each one, you can measure exactly how effective each ad is. You then drop or change ads that are not giving you a return on investment.
It uses compelling headlines and sales copy. Direct response marketing has a compelling message of strong interest to your chosen prospects. It uses attention grabbing headlines with strong sales copy that is “salesmanship in print”. Often the ad looks more like editorial than an ad (hence making it at least three times more likely to get read).
It targets a specific audience or niche. Prospects within specific verticals, geographic zones or niche markets are targeted. The ad aims to appeal to a narrow target market.
It makes a specific offer. Usually the ad makes a specific value-packed offer. Often the aim is not necessarily to sell anything from the ad but to simply get the prospect to take the next action, such as requesting a free report.
The offer focuses on the prospect rather than on the advertiser and talks about the prospect’s interests, desires, fears and frustrations.
By contrast mass media or “brand” marketing has a broad, one size fits all marketing message and is focused on the advertiser.
It demands a response. Direct response advertising – pay-per-click advertising, for example – has a “call to action”, compelling the prospect to do something specific. It also includes a means of response and “capture” of these responses.
Interested, high probability prospects have easy ways to respond such as a regular phone number, a free recorded message line, a web site, a fax back form, a reply card or coupons.
When the prospect responds, as much of the person’s contact information as possible is captured so that they can be contacted beyond the initial response.
Multi-step, short-term follow-up. In exchange for capturing the prospect’s details, valuable education and information on the prospect’s problem is offered. The information should carry with it a second “irresistible offer” – tied to whatever next step you want the prospect to take, such as calling to schedule an appointment or coming into the showroom or store. Then a series of follow-up “touches” via different media such as mail, e-mail, fax and phone are made. Often there is a time or quantity limit on the offer.
Maintenance follow-up of unconverted leads. People who do not respond within the short term follow-up period may have many reasons for not “maturing” into buyers immediately. There is value in this bank of slow-to-mature prospects. They should continue hearing from you once to several times a month.
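To make the “trackable” point concrete, one common mechanism (the domain and parameter names below are purely illustrative) is to give every ad its own tagged link or dedicated phone number, so each response identifies its source:

    https://example.com/offer?source=localpaper&campaign=spring-sale

When the print ad carries one tag and the email campaign another, every opt-in or order can be credited to the exact ad that produced it – which is what makes the measurement described above possible.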
Money At A Discount
Direct response marketing is a highly ethical way of selling. It’s focused on the specific problems of the prospect and aims to solve these problems with education and specific solutions.
It is also the only real way for a small business to affordably reach the consciousness of a prospect.
Your marketing system must deliver profitable results.
You have to know what a customer is worth to you, and then decide what you are reasonably willing to invest to acquire one, and then build systems that work within that limit.
Direct response is an accountable way to run marketing for a small business, as it is highly focused on return on investment.
If $10 bills were being sold for $2 each, how many would you buy?
The name of the game with direct response marketing is ‘money at a discount’ e.g. $2 into advertising to get $10 out in the way of profits from sales.
When you turn your ads into direct response ads, they become lead generating tools rather than just name recognition tools.
Facebook is at the vanguard of squeezing increased value from paid social media marketing (SMM) – and other networks are following
Marketers have been complaining for some time that organic reach no
longer exists on social media and that brands have to pay for adverts if
they want to engage with their target audience.
These rumblings have been growing louder over the past 12 months,
particularly in regards to Facebook, which some marketers claim is now
little more than a glorified advertising network. However, it could be
that Twitter will also become a so-called ‘pay-to-play’ network for
marketers, as it recently hinted at plans to implement a news feed-style algorithm.
This means users will no longer see tweets in one continuous
real-time feed, but will be shown the content that Twitter deems to be
the most important or relevant. Apparently the aim is to improve the user experience, but a cynic might suggest Twitter also wants to squeeze more ad revenue from brands by restricting their organic reach.
To give some context to this debate, it’s worth looking at the evidence that’s stacking up against Facebook.
Dwindling organic reach on Facebook
Much of the disquiet among marketers has been fueled by Forrester
research published in October 2013, which began with the bold statement
that “Facebook is failing marketers.” Based on a survey of 395
marketers, Forrester found that Facebook creates less business value
than any other digital marketing opportunity.
[Chart: Forrester Research highlights dissatisfaction with Facebook marketing. Source: Forrester Research]
The report’s author, Forrester analyst Nate Elliot, claimed this dissatisfaction was due to poor ad targeting capabilities that failed to properly utilise social data, and a perceived failure to deliver organic reach.
“Everyone who clicks the like button on a brand’s Facebook page
volunteers to receive that brand’s messages – but on average, you only
show each brand’s posts to 16% of its fans,” he wrote.
Separate data published by Ogilvy in March this year showed that organic reach on brand pages had plummeted to just 6%,
a sharp fall from 12% in October 2013. The situation is even bleaker
for pages with fewer than 500 fans, as they saw organic reach fall from
an already low 4% to just 2.1%. Ogilvy also quoted anonymous “Facebook
sources” as saying that it wouldn’t be long before organic reach hit
zero.
Ogilvy needn’t have relied on unnamed sources, as Facebook has been
relatively open about its desire to restrict organic reach. In a sales
deck distributed to ad partners at the end of 2013, Facebook stated:
“We expect organic distribution of an individual page’s posts to
gradually decline over time as we continually work to make sure people
have a meaningful experience on the site.”
[Chart: Facebook organic reach decreases. Source: Social@Ogilvy]
Organic reach is being crowded out
In essence, brands are being crowded out of social media platforms as
content from publishers and people’s friends is given priority in the
news feed. The document went on to suggest that Facebook fans were no
longer a way to gain free exposure, but were instead a way of making ads
more powerful as they provide greater social context.
The Facebook report continued: “Your brand can fully benefit from
having fans when most of your ads show social context, which increases
advertising effectiveness and efficiency.”
[Image: Social context in Facebook ads. Source: Facebook]
Is the future ‘pay-to-play’?
It does seem that it’s becoming increasingly difficult for brands to
gain any organic exposure on Facebook. The evidence from third-party
research and Facebook’s own declarations make the future seem bleak for
those who hoped that the free ride of social advertising might continue.
News from other social networks also suggests a shift to the ‘pay-to-play’ model.
Twitter
On average, tweets only reach around 10% of followers as they are quickly drowned out by other posts,
according to data pulled from Twitter’s new analytics platform. As
mentioned, Twitter is chasing greater ad revenues and planning new
controls that dictate what content users see in their feeds, so it could
be that brands find their organic reach is further reduced.
Pinterest
Pinterest is also finally moving towards monetising its platform
through the use of ‘Promoted Pins’, which the company announced in June.
There is currently a waiting list for this function,
which may be the start of restricting what branded content users are
exposed to. The company’s last funding round valued the business at
$3.8bn (£2.3bn).
We’ve already seen that Google is willing to remove privileges from its free services – such as hiding keyword data in Google Analytics tools – in order to boost its ad revenues.
Organic social reach isn’t dead yet
In April, new research showed that while on average many Facebook brand pages have seen a drop in organic reach, the top 1% of pages still reached 82% of their fans.
It could be that it simply comes down to producing content that is both
relevant to your audience and tied to long-term business goals, rather
than chasing virality and looking for quick wins.
Ultimately, it’s a matter of wait and see for how the social media
networks develop their revenue streams and what this means for brand
content.
David Moth is deputy editor at Econsultancy where he covers ecommerce and digital marketing. Follow him on Twitter: @DavidMoth
Anyone who runs their own business knows how hard it is to keep up with the myriad day-to-day tasks that go along with business ownership. When you’re juggling the responsibilities of being CEO, CMO and HR, chances are your Internet presence is suffering. Websites are no longer the simplistic online brochures they were 15 years ago; they have evolved into full-scale business solutions. They are multi-functional tools that can be used to solve a variety of problems, including those related to sales, marketing, logistics, customer service and public relations. In the last decade, a number of free content management systems (CMS) were developed to enable anyone to establish a marketing presence on the web. The rising popularity of CMS such as WordPress®, Joomla® and Drupal® has led to a dramatic increase in the number of do-it-yourselfers -- especially small business owners who decide to take on the roles of web manager, search engine optimization expert and Internet marketer. But it’s not do-it-yourself if you’re not doing it.
Forward-thinking entrepreneurs KNOW that a good web manager is worth their weight in gold, working with you to grow your Internet business presence while allowing you to focus on the other essential aspects of owning and running a business. Website managers open the door to success by maintaining your company image, directing sales traffic, managing consumer relations and promoting your virtual real estate. The main focus of a website manager is to keep your website up-to-date and operating smoothly. A well-maintained and marketed website will provide a steady stream of business.
Now that you know how a website manager can help you grow your business, it’s important to know what types of skills to look for in a website manager. An effective web manager will, at a minimum, offer the following website management services:
Website maintenance & repair
Internet marketing
Search engine optimization
Content management
Email marketing campaign development & deployment
Social media marketing & management
Reputation Management
Not only should your website manager offer these services but they must know how to use them in the most effective ways to benefit your business RIGHT NOW.
One key aspect of locating the right website manager for your business is to ensure that they offer website maintenance services. This requires specialized skills, including in-depth knowledge of hypertext markup language (HTML), cascading style sheets (CSS) and JavaScript to properly manage your website’s code and scripts. All web browsers use HTML and CSS to render text and graphics, and nearly all websites use some form of JavaScript to program or control a website’s functionality. Today’s computers, mobile devices and smartphones use a multitude of Internet browsers. Poorly written code or invalid programming can cause your site to fail to function on one or more browsers. Just as importantly, these issues can delay or prevent search engines from indexing your website. While research shows 76% of consumers use search engines to locate local goods and services, error-riddled websites are rarely included in search results due to their poor quality.
Since website management is such a critical job, it’s important to be sure that your website manager has professional level skills. There are two free tools that can help you evaluate the skills and expertise of a website manager or website management company. You can use these tools to examine previous work samples and see firsthand how effectively they market their own website.
1.) Website code validation tool – enter any URL to check a web page for code errors (the W3C’s free markup validation service at validator.w3.org is one such tool). The tool will display the number of errors on the page. The goal for any website is 0 errors, to ensure optimal functionality and search engine indexing.
2.) PageRank® checker – Enter a web address to see the ranking of any page. This rating is an indication of your online marketing reputation. It’s also an indication of the online reputation of a potential web manager. You will want your website management provider to have a rank of 3 or above.
Understanding the nuances of search engine algorithms and optimizing websites properly is a mandatory skill for any website manager. Nine out of 10 Internet users in the United States say that Google is their preferred search engine. Imagine having your site banned by Google. This can be the result of working with an inexperienced or overzealous website manager. And even if your site is fully developed and optimized when it launches, the work of a web manager is not finished. Web maintenance is an ongoing process. Your web manager must stay abreast of the many regular updates and changes required to keep the site properly optimized. This is crucial because if your customers and prospects cannot find you by using popular search engines, your business will be lost on the over-populated Internet.
Effective website management will help your website content rank higher in search engine results. This is especially important since 75 percent of searchers don’t scroll past the first page of search engine results. Consumers pay the most attention to results displayed at the top of the search engine results page. If your business doesn’t rank well in organic (unpaid) search results, or if a search yields negative or unflattering reviews, for example, you’re losing a lot of potential customers looking for trustworthy, quality businesses like yours. A good web manager can help by creating informative, useful and interesting content that contains relevant keywords, ensuring your business is positioned well in search engine results.
To maximize the results of your website marketing strategy, focus your resources on the avenues with the highest return on investment for your business including a social media campaign. Are you a beauty supply company wasting valuable hours a week focusing on Twitter posts? Perhaps you heard the latest buzz about Pinterest or Instagram so you devote a few hours a week to posting information about your building supply company. Without social media management (SMM), however, you may actually be wasting your valuable time trying to attract the wrong audience. A social media manager can help you to focus efforts on the best media platforms to bring the fastest return on investment for your business.
Reputation management is another key component of your social media strategy, and it is something to think about before your business is negatively impacted – when orders dry up or your phones stop ringing. You need to know what people are saying about your company in real time. Monitoring social media can alert you when someone mentions you or your employees on popular sites like Yelp, Facebook or Twitter. Staying abreast of what others are saying about your business puts you in the best position to respond in the most appropriate way to maintain your professional reputation. You may not always be able to recover the customer, but you can absolutely prevent a recurrence.
Businesses have a lot of options when it comes to website management. Let IMC Website Management take care of maintenance, repairs and promotion. We'll manage every aspect of your website including complicated upgrades, repairs and security. We don't stop there; we take care of getting the word out about your website by effectively marketing your site to search engines, directories and social media.
Search engines – webmasters and search engine optimization (SEO) professionals follow their guidelines for the highest possible rankings
on them; paid search marketers pay to be featured on them; and users
turn to them when they're searching for answers, information, or
entertainment.
Search Engine Watch has been covering search engines since June 1997
and has watched the industry evolve to its current state. Over time,
many search engines have come and gone, as users have spoken with their
keyboards (and literally with their voices – thanks to voice search
technology).
In recent years, search market share has remained mostly unchanged –
for much of the world, it's Google followed by every other search engine
(in the U.S. the "Big 5" search engines consist of Google, Bing, Yahoo,
Ask.com and AOL, which combine for hundreds of billions of searches
every month). Meanwhile, many of the players have consolidated or have
become footnotes in history.
What follows is an overview of today's major global search engines,
with some history and explanation of why each one is important to
webmasters, marketers, and users. We'll then review some of the top
directories.
Google
Started in 1998 as a university project by Stanford University students Sergey Brin and Larry Page, Google is now the dominant search engine by no small margin – and that dominance didn't evolve slowly.
In fact, in June of 1999 Netscape Search was updated and AOL/Netscape search began to be powered by Google, bringing its search volume to approximately 3 million searches per day – huge for the time.
On June 26, 2000, Yahoo selected Google to provide its organic search results (replacing Inktomi) with its impressive index of more than 25 million pages – again, huge for the time.
Google has since become synonymous with the word "search" and as most
of us know, is often used in place of the word. Don't know the answer?
Google it!
The continued strength of Google as a search provider is based on a large number of factors and won't be debated here; suffice it to say, the company has successfully provided the results people are looking for in a manner searchers either enjoy or are comfortable enough with not to switch to a different provider.
Google continues to tweak their search algorithm multiple times per
month and adjust the layout of their results to test for improved
visitor experience and advertising revenue.
The majority of Google's revenue is derived from their AdWords and
AdSense programs. In fact, advertising accounts for more than 95 percent
of Google's earnings. If there is a weakness in the Google model, this is it: they need to tweak their layout and results to promote the paid avenues of their offerings. This gives an advantage to other engines that may have revenue generation strategies outside of search.
Bing
Bing was launched in May 2009 as a fundamental upgrade to Microsoft's previous effort in search, MSN Search.
Since the launch of Bing, Microsoft's share of the search marketplace
has more than doubled. Add to that the deal between Microsoft and Yahoo
for Bing to power Yahoo's organic results and Bing powers over 25
percent of search.
With the Microsoft/Yahoo alliance also came the arrangement that Yahoo's paid search platform would be used to power both Yahoo's and Bing's paid results. While this may not seem like a big deal on the surface, it is actually huge.
Where once business owners and marketers had to consider whether it was worth the hassle of managing a Bing paid campaign (for the significantly lower traffic it yields compared with Google) and then make the same call on managing a Yahoo paid campaign, the two are now manageable in one convenient location, significantly reducing the time it takes to set up and manage them.
This of course makes these campaigns less expensive to run, and when you combine that with Bing's increased market share, they're in a position to take some of the ad dollars from Google (or at least gain some for themselves).
Yahoo
Yahoo is an interesting search engine and one which, until recently, I had a very hard time taking seriously.
Once upon a time Yahoo was a major leader in the search field, but it has been in decline, making bad decision after bad decision, announcing layoff round after layoff round and making what can only be described as one of the worst business decisions in history when it turned down a takeover offer from Microsoft valued at $33 a share. Yahoo shares dove after that and have never been anywhere close since.
From that point until 2012, it seemed that every piece of news from Yahoo was bad news – until July 16, when the announcement came that the company had snagged Marissa Mayer from Google to become CEO. This was the first encouraging move they'd made in a long time and had people wondering if this might just be the breath of fresh air and change of direction that the company needed.
From reviewing all hires and selling key holdings such as its Alibaba stake to putting its own search technology back at the forefront, Yahoo has maintained its position as one of the top three search engines, despite not producing its own organic results.
Other Major Search Engines Around the Globe
Google dominates the U.S. and most of the world – but not everywhere.
Yahoo and Bing have had about the same luck (zero) making a dent in
Google's search market share on other continents, but a couple of search
engines in other countries have managed to stay ahead of the Mountain
View, California-based search engine. If you're from these countries or
interested in marketing to them – pay attention.
Baidu
In China, Baidu is the major player, with more than three of every four searches conducted on its engine.
To say Baidu blends organic with paid search is misleading; rather, it uses a hybrid approach with pay-for-performance (P4P) results, where advertisers bid to have their websites placed at the top of what would appear to be the organic results.
In addition, Baidu offers PPC which, similar to AdWords, is displayed
at the top or right of the standard results. One could argue that the
existence of the PPC-like results further confuses the users clicking on
the standard results area into believing they are organically
generated.
While some investors consider Baidu to be overvalued as a stock,
their earnings are consistently high. For companies looking to market
into China, understanding Baidu is crucial.
Yandex
Yandex is the primary and most popular Russian-language search engine, with significant market dominance in Russia.
On October 1, 2012, Yandex launched their own browser and mobile app
to keep their position secure against Google, their only real competitor
in the space.
Yandex's advantage in Russia seems to be based on an algorithm that is much better at understanding the language's unique syntax and integrating that into its judgment of what type of results the user is likely looking for (for example, is the search string a question or simply keyword entry?).
Directories
Directories are an interesting topic. Do they carry weight? Can they
hurt your rankings? Should you even bother? The answer here is yes, yes,
and yes.
This section will mainly focus on general directories, but the end of
this section does include a few tips on how to find niche directories
(or even other general directories) and how to determine if they are
worth getting a listing on.
Yahoo
The Yahoo Directory was started in 1994 under the name "Jerry and David's Guide to the World Wide Web" and became Yahoo in 1996. At the time Yahoo was primarily a directory with search functionality, and (interestingly) neither SEO nor Internet marketing was even a category.
Through the late 1990s Yahoo pushed to become a web portal, and in 2000 it even signed a deal with Google that would see Google power Yahoo's search functionality. The focus at the time was to acquire users through acquisitions such as GeoCities (RIP), bringing more people into the portal and keeping them there. Unfortunately Yahoo! didn't have the same user loyalty that Apple does, and the walled-garden approach failed as users Googled their way out of the Yahoo network of sites (ironically, right on Yahoo's own properties).
All this said, however, they still provide a solid directory (back to their roots). The cost is a non-refundable $299 review fee.
BOTW
Best of the Web may be my favorite of the general directories, due in no small part to the fact that they allow for a permanent listing. The directory was founded in 1994 as a listing of the best of the web (1994 seems to have been the year of directories) and actually gave out a series of awards (take a peek; it's interesting to see what types of things won back then). That lasted until 1998, at which time the site lay dormant until it was purchased in 2002 and became a general web directory.
BOTW is a human edited directory. They will decline your listing if
they don't like the site. A submission is $150 annually or $300 for a
permanent listing.
DMOZ
No list of directories would be complete without DMOZ. DMOZ was founded in June 1998 as Gnuhoo. It was purchased by Netscape in October of the same year, at which time it became The Open Directory Project. By April 2000 it had surpassed the Yahoo Directory in the number of URLs in its index, and it currently sits at about 5.2 million.
For those in the industry long enough to remember, DMOZ suffered a catastrophic failure in October of 2006, at which time it had to display a backup version of its directory. This wasn't remedied until December, and new sites couldn't be suggested until January. This is the time when it seemingly became increasingly difficult to get a listing in DMOZ, as many editors seemed to have found new things to do with their time.
It is still possible to get a listing in DMOZ. For the 10 minutes it
takes, it's well worth the time and it's free to submit. (Tip: try to
submit to a category that has an editor.)
Business.com
Business.com was started in 1999 as a search engine for businesses and corporations. The company came close to bankruptcy when the dot-com bubble burst, but after major layoffs and restructuring it became profitable once more in 2003.
Business.com is focused on business-to-business resources (so take that into consideration when thinking about submitting). The cost is $299 per year and all submissions are reviewed manually.
As with Yahoo and BOTW, the fee is non-refundable if your site isn't accepted. You're paying for the review, not the link.
Honorable Mentions
Moving past the major players, there are also a number of other good
general directories. These directories have all survived many updates
including the Penguin and Panda rounds.
Remember, though, link building is about balance. You don't want to
submit to a bunch of directories and consider your job done. A better
strategy would be to bookmark this page, submit to a few and as you're
building more links using different strategies, add a directory or two
mixed in with the rest.
Again, this list only contains consistently solid general directories.
Directory Guidelines
You'll want to also look at niche directories (which may well hold
more weight than any of the general directories above), but you need to
be careful. There are many horrible directories out there.
Here are a few directory guidelines to follow that universally apply:
Is the submission a guaranteed placement? If a directory will list
you automatically (with or without a fee) then it's not an editorial
link and either doesn't carry weight or likely won't in the near future.
It should be avoided.
Do they require a link back? If they do (even for their free listings when a paid is available), it probably should be avoided.
Is their PageRank 3 or below? Yes, it's an old metric, but is still
helpful to gauge general site health. A directory with a PageRank of 3
or less will, at best, pass virtually no weight; at worst, it'll cause
you problems. Generally, you should only look at PageRank 3 directories
in the case of niche directories; with general directories, don't even
consider anything less than a 4.
Common sense. Ah, the toughest one because our brains can trick us
into seeing what we want to see. When you look at a directory (or any
other link source for that matter) you have to ask yourself, "does it
make sense that this link should pass weight to my site?" If you can
honestly say "yes" to this then it's likely a good link.
A Final Warning
The saying "don't put all your eggs in one basket" comes into play
here. Once again, directories can provide good and relevant links to
your site (and hey, even some traffic) but a solid link profile contains
variety.
Never put all your energies into one single link source. If you find a
lot of great niche directories, put them all on a list and add a couple
each month while you're engaged in other strategies to help remind
Google that you're not a one-trick pony. You have good content liked by
directory editors, bloggers, social media netizens, and others.
A newly discovered vulnerability in
OpenSSL, one of the most commonly used implementations of the SSL and TLS
cryptographic protocols, presents an immediate and serious danger to any
unpatched server. The bug, known as Heartbleed, allows attackers to intercept
secure communications and steal sensitive information such as login credentials,
personal data, or even decryption keys.
Heartbleed, or the OpenSSL TLS 'heartbeat' Extension
Information Disclosure Vulnerability (CVE-2014-0160), affects a component of
OpenSSL known as Heartbeat. OpenSSL is one of the most widely used, open source
implementations of the SSL (Secure Sockets Layer) and TLS (Transport Layer
Security) protocols.
Heartbeat is an extension to the TLS
protocol that allows a TLS session to be kept alive, even if no real
communication has occurred for some time. The feature will verify that both
computers are still connected and available for communication. It also saves the
user the trouble of having to reenter their credentials to establish another
secure connection if the original connection is dropped.
How does it work? Heartbeat sends a message to the OpenSSL server, which in turn relays that message back to the sender, verifying the connection. The message contains two components: a packet of data known as the payload, which can be up to 64KB in size, and information on the size of the payload.
However, the Heartbleed vulnerability in
OpenSSL allows an attacker to spoof the information on the payload size. For
example, they could send a payload of just one kilobyte in size, but state that
it is 64KB.
How an OpenSSL server deals with this
malformed Heartbeat message is key to the danger this vulnerability poses. It
does not attempt to verify that the payload is the same size as stated by the
message. Instead it assumes that the payload is the correct size and attempts to
send it back to the computer it came from. However, since it doesn’t have the
full 64KB of data it will instead automatically “pad out” the payload with data
stored next to it in the application’s memory. If the server received a 1KB
payload, it will thus send it back along with 63KB of other data stored in its
memory. This could include the login credentials of a user, personal data, or
even, in some cases, session and private encryption keys.
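To make those mechanics concrete, here is a heavily condensed C sketch of the vulnerable pattern – the names and structure are invented for clarity; this is not the actual OpenSSL source:

    /* Simplified illustration of the Heartbleed pattern (CVE-2014-0160). */
    #include <string.h>

    struct heartbeat_msg {
        unsigned short declared_len; /* payload length claimed by the sender */
        unsigned char *payload;      /* the bytes that actually arrived */
        size_t actual_len;           /* how many bytes actually arrived */
    };

    void reply_to_heartbeat(const struct heartbeat_msg *msg,
                            unsigned char *response)
    {
        /* BUG: the copy trusts declared_len and never compares it to
         * actual_len. A 1-byte payload claiming to be 64KB makes memcpy
         * read roughly 64KB of adjacent process memory into the response --
         * potentially credentials, session data, or key material. */
        memcpy(response, msg->payload, msg->declared_len);
    }

The whole vulnerability comes down to that one unchecked length; the fix in OpenSSL 1.0.1g is, in essence, to silently discard any heartbeat whose declared length exceeds the bytes actually received.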
The data the application sends back is
random and it is possible that the attacker may receive some incomplete or
useless pieces of data. However, the nature of the vulnerability means that the
attack can be performed again and again, meaning the attacker can build a bigger
picture of the data stored by the application over time.
Private encryption keys may be the most
difficult thing to steal using this attack. Data is stored in a sequential
fashion, with new data stored in front of older data. Encryption keys will
usually be stored “behind” the payload in memory, meaning they are less likely
to be accessed. Content from current SSL/TLS sessions is the type of data most
likely to be at risk.
The Heartbleed bug is the latest in a
series of SSL/TLS vulnerabilities uncovered this year. TLS and its older
predecessor SSL are both secure protocols for Internet communication and work by
encrypting traffic between two computers.
In March, a certificate vulnerability was
found in security library GnuTLS, which is used in a large number of Linux
versions, including Red Hat desktop and server products, and Ubuntu and Debian
distributions of the operating system.
GnuTLS is an open source software
implementation of SSL/TLS. The bug meant that GnuTLS failed to correctly handle
some errors that could occur when verifying a security certificate. This could
allow an attacker to use a specially crafted certificate to trick GnuTLS into
trusting a malicious website. The vulnerability
was immediately patched by GnuTLS.
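The general shape of that bug class can be sketched in a few lines of C – the names and logic below are invented to illustrate the behavior as reported, not taken from the GnuTLS source. A verification helper signals failure with a negative error code, but the caller tests the result as a simple true/false, so an internal error reads as success:

    #include <stdio.h>

    /* Returns 1 = issuer valid, 0 = invalid, negative = internal error. */
    static int check_issuer(int simulate_error)
    {
        return simulate_error ? -1 : 1;
    }

    static int certificate_is_trusted(int simulate_error)
    {
        int ret = check_issuer(simulate_error);
        if (ret != 0)   /* BUG: a negative error code also passes this test */
            return 1;   /* ...so an internal failure reads as "trusted" */
        return 0;
    }

    int main(void)
    {
        /* An attacker who can force the error path gets a bad certificate
         * accepted as trusted. */
        printf("trusted despite error: %d\n", certificate_is_trusted(1));
        return 0;
    }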
Heartbleed is by far the most serious vulnerability in SSL/TLS to be uncovered of late. The nature of the bug, and the fact that it affects one of the most widely used implementations of SSL/TLS, means that it poses an immediate risk.
Advice for businesses:
• This is a vulnerability in the OpenSSL library, not a flaw in SSL/TLS itself, nor in certificates issued by Symantec.
• Anyone using OpenSSL 1.0.1 through 1.0.1f should update to the latest fixed version of the software (1.0.1g), or recompile OpenSSL without the heartbeat extension.
• After moving to a fixed version of OpenSSL, if you believe your web server certificates may have been compromised or stolen as a result of exploitation, contact the certificate authority for a replacement.
• Finally, and as a best practice, businesses should also consider resetting end-user passwords that may have been visible in compromised server memory.
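For reference, a quick way to see which OpenSSL version a host reports is from the command line (the output format varies by system):

    $ openssl version
    OpenSSL 1.0.1f 6 Jan 2014

A banner in the 1.0.1 through 1.0.1f range indicates a host that needs the 1.0.1g update or a rebuild with the documented -DOPENSSL_NO_HEARTBEATS compile-time flag. Note that some Linux distributions patch the flaw without changing the version string, so check your vendor's advisory as well.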
Advice for consumers:
• Be aware that your data could have been seen by a third party if you used a vulnerable service provider.
• Monitor any notices from the vendors you use. Once a vulnerable vendor has communicated to customers that they should change their passwords, do so.
• Beware of potential phishing emails from attackers asking you to update your password – to avoid going to an impersonated website, stick with the official site domain.
• Stick to reputable websites and services; they are the most likely to have addressed the vulnerability immediately.
• Monitor your bank and credit card statements to check for any unusual transactions.
UPDATE April 10, 2014: Symantec’s SSL Tools Certificate Checker will check whether a website is vulnerable to exploitation. You can access the Certificate Checker at the following location: https://ssltools.websecurity.symantec.com/checker/
To use the Certificate Checker, click on Check your certificate installation and then enter your website URL.
It has come to our attention that some hosting companies are force upgrading or suspending Joomla 1.5 users citing security concerns.
We would like to clear the air: even though Joomla 1.5 is not officially supported any more, it is still a secure system as long as you have the patch for the one reported issue in place. There are no other known vulnerabilities in Joomla 1.5.26.
Site Owners using Joomla! 1.5, what are your options?
In an ideal world, website owners would already have migrated to Joomla 2.5.x or 3.x, or would at least be in the planning stages of doing so by now. But if for whatever reason you cannot migrate immediately, you need to communicate with your hosting provider to make sure they support Joomla 1.5.
If you are a Joomla site owner with a site on one of the hosts forcing a migration, here are your options:
Move to a host that is willing to support a Joomla 1.5 installation
Make sure your Joomla site is running the latest stable version, 1.5.26, and also has the patch applied
Make sure if you have any third party extensions installed that they are running the latest version — Third party extensions are also vulnerable to security issues
If you are a hosting provider that does not wish to support Joomla 1.5 sites any more, we recommend educating users on website security and maintenance best practices. Using a forceful one-click script updater is not recommended and may create additional problems.
Reach out to Joomla experts who can help you keep your Joomla 1.5 sites secure & also help you with a communication strategy to promote upgrades to the latest supported versions.
Understand that migrations are complicated; a ‘one solution fits all’ migration can break thousands of sites and leave you with a huge number of support issues
The 1.5 to 2.5 migration can be particularly challenging as many templates and extensions need to be re-written or updated
Users will need a sufficient notice period to allow them to plan for the migration of their website - especially if they need to bring in Joomla experts to do this work for them.
So the bottom line is that if you do have a Joomla! 1.5 site, it’s not the end of the world. You can keep running it as long as you keep it secure, but site owners should plan on migrating to a stable, supported version as soon as possible.
When Tom Telford helped found a vacation rental management company, Blue Creek Cabins, in 2001, he wanted a quick and easy way to connect with people looking to rent the 20 cabins he and his partner managed in and around the mountains of picturesque Helen, Ga.
Tom Telford built his cabin rental business using Google AdWords, but later changed his online ad strategy.
That is when he heard about a program called AdWords being offered by a new company, Google. Finding the system relatively easy to use, Mr. Telford selected a few keywords, like “Helen GA cabin rentals,” and agreed to pay Google 60 cents every time someone performed a search and clicked on his ad.
Before long, the calls and e-mails started pouring in. “The results were phenomenal,” said Mr. Telford, whose company is used by property owners to market their cabins. Encouraged, he invested more in his pay-per-click advertising efforts, which in time included similar programs offered by Bing and Yahoo.
By 2010, Mr. Telford had started a new management company, Cedar Creek Cabin Rentals, and was spending $140,000 a year on pay-per-click advertising to promote the 45 cabins in his charge. The programs had become increasingly popular and competitive, which meant that in order to retain his ranking in search results, he had to pay about $1.25 a click, double what he had paid initially. “The cost per keyword climbed dramatically over the years,” he said. “And it’s still going.”
And that is a problem. While Mr. Telford agreed to pay more for his keywords, he said he did not see a commensurate increase in sales. “For a while, I was spending more than I was getting,” he said. “It finally hit me to ask, ‘Can I sustain this?’ ”
This concern has become increasingly common as online advertising has become a standard channel for large companies. Attracting those additional advertisers has been great for Google, which reported a 42 percent increase in paid clicks, year over year, for the second quarter of 2012. But the heightened competition has driven up the prices for keywords and made it harder for small companies like Mr. Telford’s.
While about 96 percent of pay-per-click advertisers spend less than $10,000 a month, according to AdGooroo, a research firm that studies the pay-per-click market, big-budget advertisers spend hundreds of times more. In the first half of 2012, Amazon reportedly spent $54 million, and the University of Phoenix $37.9 million. “AdWords can bleed many a small business dry,” said Sharon Geltner, an analyst at the Small Business Development Center at Palm Beach State College in Boca Raton, Fla.
“The only way for smaller advertisers to get an edge is to spend a lot of time improving the quality and relevance of their ads,” said Richard Stokes, author of “Ultimate Guide to Pay-Per-Click Advertising” and the founder of AdGooroo. “The problem is that everyone else is doing that as well.”
Until recently, Byron Udell, founder and chief executive of AccuQuote, a life insurance agency based in Wheeling, Ill., was spending several million dollars a year on pay-per-click campaigns. But after watching the price of keywords like “life insurance” rise to more than $20 from about $1 over the last 10 years, he decided to scale back greatly.
“The cost to get someone just to visit your Web site has, in some cases, become prohibitive,” Mr. Udell said. “Something that cost $3 might be a no-brainer, but at $20 it becomes absurd. It’s basic math, and if it doesn’t add up, we won’t do it.” He said he planned to redirect some of his advertising dollars to print, television and radio.
Google does not dispute the accounts of owners like Mr. Udell. A Google spokesman released a statement saying that small businesses can compete by making their ads more relevant to consumers and that they should use multiple strategies to pursue customers: “search, social media, earned media and more.”
Many analysts agree. “AdWords is still doable and reasonably profitable for local businesses or those that have narrow niches and high barriers to entry,” said Perry Marshall, the author of “Ultimate Guide to Google AdWords.” “But you cannot put all your eggs in one basket. The ultimate goal for any business should be to drive as much unpaid traffic to their site as possible.”
The increased demand for unpaid, or organic, search results has given rise to an entire industry specializing in search engine optimization, or S.E.O., with countless professed experts who promise to improve a Web site’s search ranking.
Mr. Telford said he was approached by dozens of such experts. “My competitors were inching up in organic traffic because I wasn’t doing anything,” he said. “But I also wasn’t comfortable hiring an S.E.O. expert, because none of them could explain exactly why what they were doing would work. It felt like they were selling me black magic.”
As he looked for alternatives, Mr. Telford came across a number of companies like RhinoSEO, Marketo, Eloqua and Pardot, which sell online services that promise to automate a company’s marketing efforts and improve organic search results. The basic idea, Mr. Telford concluded, was that investing in social media content like blogs and Facebook pages could attract unpaid traffic.
“It hit me like a brick, because I finally understood how you get better search results by creating content around the keywords people are searching for,” he said. “As we become more relevant to Google, our quality score improves in our AdWords campaigns. This enables us to bid lower, yet because we’re more relevant, we pay less per click.”
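Google has described the underlying auction in roughly these terms: an ad’s position is determined by its Ad Rank, essentially the advertiser’s bid multiplied by the ad’s Quality Score. A hypothetical illustration (these numbers are not Mr. Telford’s actual figures): an advertiser bidding $1.00 with a Quality Score of 8 has an Ad Rank of 8.0 and outranks a rival bidding $1.50 with a Quality Score of 5 (Ad Rank 7.5) – holding a better position while paying less per click. Raising relevance, in other words, buys position that would otherwise have to be bought with a higher bid.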
As a small-business owner without a full-time marketing staff, Mr. Telford wanted a tool that could help him manage both his social media content and his pay-per-click expenditures, which he planned to continue on a much-reduced basis. After conducting his research, he chose to sign up for the services offered by a company called HubSpot, which is based in Boston.
Available online as software-as-a-service, HubSpot helps business owners set up a blog and optimize it to be recognized by search engines. The site, which has more than 8,000 customers, most of whom pay $200 to $1,000 a month, helps users populate and manage their Twitter, Facebook and LinkedIn accounts, along with any pay-per-click campaigns. It also tracks visitors and helps subscribers calculate the return on investment for their marketing initiatives.
Even though Google is one of its investors, HubSpot cut back on its own pay-per-click expenditures after realizing that organic searches were accounting for 60 percent more traffic than paid searches. “Most of our paid efforts shifted to platforms like LinkedIn, where we could target for the right kinds of job titles in line with our target customer profiles,” said Dan Slagen, who is in charge of advertising at HubSpot.
In March 2011, Mr. Telford started blogging through HubSpot about topics like where to find the best fishing holes, and despite fears of a devastating loss of traffic, he reduced his pay-per-click budget to $100,000. By the beginning of 2012, some six months after he began blogging roughly five times a week, his organic traffic was up 91 percent over the previous year. And the number of conversions, or visitors who took an action on the site, had increased 37 percent.
Mr. Telford was so encouraged that he cut his pay-per-click budget again, to $33,000. “I still love Google because they got me there,” he said. “But that ride can’t last forever.”