Welcome to web 2.0 Blog

The latest web development news from the world's masters

Design a Retro Style Ribbon

Posted by web2.0 Design resourse, Thursday, September 17, 2009, 1 comment

Are you interested in retro style? I have an interesting tutorial for you on how to make a retro-style ribbon yourself.
Start by making a new document in Photoshop – around 500×500 pixels in size, or whatever you prefer – and fill it with the color #dac8aa.
Design a Retro Style Ribbon 01
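By the way, if you prefer scripting to clicking, this first step can also be done with Photoshop's built-in ExtendScript (JavaScript) engine. A minimal sketch, assuming Photoshop's scripting support; the document name is made up:

    // Create the 500x500 document and fill it with the background color
    app.preferences.rulerUnits = Units.PIXELS;   // interpret sizes as pixels
    var doc = app.documents.add(500, 500, 72, "retro-ribbon");
    var bg = new SolidColor();
    bg.rgb.hexValue = "dac8aa";                  // the fill color from the step above
    doc.selection.selectAll();
    doc.selection.fill(bg);
    doc.selection.deselect();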
Now, find an appropriate texture to put on the canvas. You can find a lot of interesting textures online, or feel free to use mine. Open the texture and copy it onto the canvas, then desaturate it with Image > Adjustments > Desaturate:
Design a Retro Style Ribbon 02
After that, change the layer mode to Overlay and the opacity to 60%, then merge the two layers into one.
Design a Retro Style Ribbon 03
Ok, now I would like to sharpen the background a little. Use the Sharpen Tool to do this.
Design a Retro Style Ribbon 04
Ok, now we can start creating the ribbon. First of all, grab the Rectangular Marquee Tool, create a selection like the one in my picture below, and fill it with green (#95b900) on a new layer.
Design a Retro Style Ribbon 05
Add two more lines in the same way using yellow (#e9db03) and red (#b94a00) colors.
Design a Retro Style Ribbon 06
Design a Retro Style Ribbon 07
Remove the selection with Ctrl+D and duplicate the layer with Ctrl+J, then use Ctrl+T to rotate the copied layer as you can see below.
Design a Retro Style Ribbon 08
Then move to the original layer and rotate it too.
Design a Retro Style Ribbon 09
Now load this layer as a selection with Select > Load Selection and drag a black-to-transparent gradient across it. This will be the shadow from the upper part of the ribbon.
Design a Retro Style Ribbon 10
Remove the selection with Ctrl+D and merge these two layers into one. After that, grab the Polygonal Lasso Tool and select part of the ribbon at the bottom. Press Delete to clear the selected part of the image, then remove the selection again.
Design a Retro Style Ribbon 11
I think it's time to add some text to our ribbon and give it some life, right? Grab the Horizontal Type Tool and write some text onto your ribbon. In the image below I've used a_MonumentoTitul as my font; if you don't have this font, just find another good one. I also used the color #efe6e0. After that, rotate the text onto the ribbon with Edit > Free Transform and duplicate it four more times.
Design a Retro Style Ribbon 12
Then merge all the text layers and the ribbon layer into one and apply the Outer Glow layer style to it.
Design a Retro Style Ribbon 13
Now you should have something like this:
Design a Retro Style Ribbon 14
Apply Filter > Texture > Texturizer to this layer:
Design a Retro Style Ribbon 15
Now, we have a relief like this:
Design a Retro Style Ribbon 16
Then, move on to the next step. To give our ribbon a three-dimensional effect, use the Dodge Tool with moderate settings (Brush: 125 px, Range: Highlights, Exposure: 20%) and do a little dodge-work on it. We should have an image like the one below:
Design a Retro Style Ribbon 17
Then select the Burn Tool (Brush: 80 px, Range: Shadows, Exposure: 30%) and do some burn-work.
Design a Retro Style Ribbon 18
Moving on to the next step, we need to add a grid to the ribbon. Create a new document, 3×3 pixels in size, and draw the pattern with the Pencil Tool as in the picture below.
Design a Retro Style Ribbon 19
Then go to Edit > Define Pattern and save it as a pattern. After that, go back to our main document, use Select > Load Selection to select the ribbon area, create a new layer, grab the Paint Bucket Tool and fill the selected area with the new pattern.
Design a Retro Style Ribbon 20
Remove the selection with Ctrl+D and play a little with the layer mode and opacity. I settled on Overlay at 40%.
Design a Retro Style Ribbon 21
Now it would be good to add a shadow in the places the arrows indicate.
Design a Retro Style Ribbon 22
Create a new layer under the ribbon layer. Grab the Polygonal Lasso Tool, create a selection like the one in my picture below, and fill it with a black-to-transparent gradient.
Design a Retro Style Ribbon 23
Remove selection using Select > Deselect and blur the shadow a little bit with the Blur Tool.
Design a Retro Style Ribbon 24
Add more shadow to the other two places in the same way.
Design a Retro Style Ribbon 25
It's looking quite good now. To finish off this tutorial, I think we need to add some shadow to the background. Go to the background layer, select the Burn Tool (Brush: 300 px, Range: Shadows, Exposure: 30%) and darken some parts of the background.
Design a Retro Style Ribbon 26
That is it for now! Our tutorial is done!
Design a Retro Style Ribbon 27

JavaScript: The Definitive Guide

Posted by web2.0 Design resourse 0 comments

 

 
JavaScript: The Definitive Guide By David Flanagan
Publisher: O’Reilly 1998-07-15 | 784 Pages | ISBN: 1565923928 | PDF (OCR from html) | 1.9 MB
JavaScript is a powerful scripting language that can be embedded directly in HTML; it allows you to create dynamic, interactive Web-based applications that run completely within a Web browser. JavaScript: The Definitive Guide provides a thorough description of the core JavaScript language and its client-side framework, complete with sophisticated examples that show you how to handle common tasks. The book also contains a definitive, in-depth reference section that covers every core and client-side JavaScript function, object, method, property, constructor, and event handler. This third edition of JavaScript: The Definitive Guide describes the latest version of the language, JavaScript 1.2, as supported by Netscape Navigator 4 and Internet Explorer 4. The book also covers JavaScript 1.1, which is the basis of the first industry-standard version, known as ECMAScript.
Since the earliest days of Internet scripting, Web developers have considered JavaScript: The Definitive Guide an essential resource. David Flanagan’s approach, which combines tutorials and examples with easy-to-use syntax guides and object references, suits the typical programmer’s requirements nicely. The brand-new fourth edition of Flanagan’s “Rhino Book” includes coverage of JavaScript 1.5, JScript 5.5, ECMAScript 3, and the Document Object Model (DOM) Level 2 standard from the World Wide Web Consortium (W3C). Interestingly, the author has shifted away from specifying–as he did in earlier editions–what browsers support each bit of the language. Rather than say Netscape 3.0 supports the Image object while Internet Explorer 3.0 does not, he specifies that JavaScript 1.1 and JScript 3.0 support Image. More usefully, he specifies the contents of independent standards like ECMAScript, which encourages scripters to write applications for these standards and browser vendors to support them. As Flanagan says, JavaScript and its related subjects are very complex in their pure forms. It’s impossible to keep track of the differences among half a dozen vendors’ generally similar implementations. Nonetheless, a lot of examples make reference to specific browsers’ capabilities.
Though he does not cover server-side APIs, Flanagan has chosen to separate coverage of core JavaScript (all the keywords, general syntax, and utility objects like Array) from coverage of client-side JavaScript (which includes objects, like History and Event, that have to do with Web browsers and users' interactions with them). This approach makes this book useful to people using JavaScript for applications other than Web pages. By the way, the other classic JavaScript text–Danny Goodman's JavaScript Bible–isn't as current as this book, but it's still a fantastic (and perhaps somewhat more novice-friendly) guide to the JavaScript language and its capabilities. –David Wall
Topics covered: The JavaScript language (version 1.0 through version 1.5) and its relatives, JScript and ECMAScript, as well as the W3C DOM standards they’re often used to manipulate. Tutorial sections show how to program in JavaScript, while reference sections summarize syntax and options while providing copious code examples.



Search Engine Optimization 101

Posted by web2.0 Design resourse 2 comments

What is SEO?

 

 


Search Engine Optimization is the process of increasing the number of visitors to a site by achieving a high position in the search results when relevant keywords are searched for. It is common knowledge that people rarely look past the second or third page of search results. Ideally you'll want a first-page position, or even the first result on the first page. To accomplish this, however, you'll need to optimize and code accordingly.

Choose the Right Keywords

 

 

Choosing the right keywords can be painless or extremely tricky depending on the scenario. You'll want to avoid generic ones, since it is going to be extremely difficult to optimize for them. Try to pick keywords that are just as specific as they need to be. For example, if you are a freelancer based in Melbourne, your optimal keywords would be freelancer Melbourne or web development services Melbourne. Going for the generic freelancer or web development keywords isn't going to do you any good.
Research your keywords. Know which ones are probably going to be searched for most and go from there.

Focus on the Content

 

 

Content always comes first. It doesn't matter if you perform some dark voodoo to get your site to the top of the results; you'll still need solid content to back that up, since visitors are going to leave pretty quickly if they don't find what they are looking for.
Having good, relevant content is the most important aspect of SEO. Your content needs to be genuinely useful to the people you want to find your site. It needs to make sense to the reader, appeal to people, and make them come back for more.
Having original content is very important. Don't expect to just copy-paste some text from another site, throw in some keywords and call it a day. You need lots and lots of original content with the keywords in the content itself. If people searching for jQuery come to your page, they expect to find something related to jQuery on it. Throw relevant keywords into the content of the page, but don't just scatter them haphazardly. Your keywords need to be in the appropriate positions and at an appropriate density; throw too many keywords around in the content and you are going to be flagged for spam.
Just as important as having original content is having regularly updated content. Fresh content brings in people and bots alike, which gets your site indexed more frequently, which in turn returns fresher results on the search results page. But don't update just for the sake of updating; bots have little incentive to come back if all your updates are only marginally incremental. Just try to have something fresh for your visitors and you should be alright.

Get a Proper Domain Name

 

 


This part is hard to get right. If at all possible, get a domain name with your keywords in the domain itself. www.webdevelopmentaustralia.com is going to carry a lot more weight with search engines than www.somecompany.com. Obtaining a domain with the proper keywords can be difficult, though.
Domains with keywords in them do look ugly, but keep in mind that keywords in the domain name carry a great deal of weight.

Create Pretty URLs


Using a URL scheme where parameters are passed as a query string makes it difficult for search engine spiders to look through your site. More importantly, when you pass the session ID as part of the URL you are essentially creating a separate URL for each session with almost the same content. This is probably going to get you penalized for duplicate content; we'll talk about that later on.
Human-readable, bot-parsable URLs are generally preferred over traditional parameter-filled URLs: www.somecompany.com/games/2009/callofduty6 is preferred to www.somecompany.com/index.php?cat=game&year=2009&name=callofduty6. When crafting pretty URLs, try to avoid days, months and years: www.somecompany.com/blog/seobasics is preferred to www.somecompany.com/blog/2009/09/09/seobasics.
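If your site happens to run on Apache, mod_rewrite is one common way to get such URLs; a hypothetical .htaccess sketch that maps the pretty form onto the parameter-filled one behind the scenes:

    RewriteEngine On
    # /games/2009/callofduty6 -> /index.php?cat=game&year=2009&name=callofduty6
    RewriteRule ^games/([0-9]{4})/([a-z0-9]+)/?$ index.php?cat=game&year=$1&name=$2 [L,QSA]

The visitor and the bot only ever see the pretty URL; the script still receives its parameters.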

Dish out Relevant Page Titles




The text within the title tag (the text displayed in your browser's title bar) is among the most important elements of a page. Actually, I'd venture so far as to say it's the most important part of your page after the actual content itself.
Make sure the title is unique for each page and contains relevant keywords. As for the title's structure, Page Title -> Site Name is vastly preferred to Site Name -> Page Title. There's no reason to feel the title needs to be as terse as possible, but on the other hand don't make it too long; 60 characters is the accepted limit.
Also, whilst using keywords in your title text, please don't try to spin it too much. If the search bot sees too many keywords, you are going to be flagged for spam. Remember, the title text is what appears on the search engine results page; you need to convey as much information as possible without sounding too spammy.
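To make that concrete, a minimal sketch (the site name is a made-up example):

    <!-- Unique per page, Page Title -> Site Name, under about 60 characters -->
    <title>Search Engine Optimization 101 - Some Company Blog</title>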

Tweak the Meta Elements

 



Meta elements used to matter eons ago, when search engine bots were less sophisticated and relied on the meta description and keywords to guide them. Once spammers took advantage of this, search bots started giving meta elements less importance.
Having said that, it doesn't hurt to include the meta description element. This is the text used as the description of your site in the results. Try to limit yourself to 200 characters; keep it simple, grammatically correct, and include relevant keywords. Keep the description unique for each page.
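A sketch of what that might look like for this very article (the wording is just an example):

    <meta name="description" content="SEO basics for web developers: keywords, titles, URLs, links and more, explained in plain English.">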

Optimize the Page Structure


Layouts vary, and the positions of your sidebar or navigation vary too. But with respect to the core markup itself, it's best to put your main content as close to the body tag as possible. If other elements have to appear before the content visually, use CSS to position them there instead of moving their markup before the content.
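One classic way to do this is the content-first float trick; a minimal sketch with made-up ids and widths:

    <!-- Markup: the main content comes first, the sidebar after it -->
    <div id="wrap">
      <div id="content">Main content ...</div>
      <div id="sidebar">Sidebar ...</div>
    </div>

    /* CSS (in your stylesheet): the sidebar still renders on the left */
    #wrap    { width: 760px; }
    #content { float: right; width: 540px; }
    #sidebar { float: left;  width: 200px; }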
Footers are wonderful places to link to other content on your site. Don't just ignore your footer; place links to recent or popular posts there. Having said that, try not to make it look like a link farm.

Use Appropriate Tags


Use the appropriate tags when developing a site. Heading tags are widely underused; people instead use generic div tags to encapsulate important information, and this is wrong. Looking strictly at the markup alone, the heading tags let us see the informational hierarchy of the page, and the same applies to the bots. Use h1 for the title of the post, h2 for each section's heading, and so on.
If you are including some code, use the pre tag. If you think some information is important, feel free to make it bold; bots tend to place value on bolded text, just as our eyes are immediately drawn to it. As always, use it sparingly. You don't want to be flagged for spam.
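For instance, a minimal sketch of this article's own hierarchy:

    <h1>Search Engine Optimization 101</h1>
    <h2>Use Appropriate Tags</h2>
    <p>Bots give extra weight to <strong>important phrases</strong>, so bold them sparingly.</p>
    <pre><code>var answer = 42; // code samples belong inside pre tags</code></pre>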

Craft Proper Links


When creating links, stay away from generic "read more" text; it's not very SEO friendly. Try to include part of the target page's title in the anchor text itself. This is not as hard as it sounds: instead of using read more, use read more about SEO. It doesn't take much time to change, but yields a lot of SEO benefit.
While linking to a page on your site, try to make the anchor text as focused as possible. Portfolio is a better candidate than What I've Done; the latter sounds catchier, but the former makes for better SEO.
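Side by side (the URL is a made-up example):

    <!-- vague: says nothing about the target page -->
    <a href="/blog/seobasics">read more</a>

    <!-- focused: the anchor text describes the target page -->
    <a href="/blog/seobasics">read more about SEO</a>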

Link Internally


Don't be scared to interlink the pages on your site. If the number of pages is small, putting them all on the navigation bar is the way to go. If yours is a big site with a ton of pages, just put the main category pages on the navigation bar. One way or the other, make sure every page can be reached through links on your site.
Thinking outside the box, you could just as easily include a popular posts section on each page. This way you get the interlinking SEO needs, and at the same time your visitors get to see some of the popular posts on your site. It's a win-win situation.
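A minimal sketch of such a block, using this blog's own posts as stand-ins (the URLs are hypothetical):

    <ul id="popular-posts">
      <li><a href="/blog/seobasics">Search Engine Optimization 101</a></li>
      <li><a href="/blog/retro-ribbon">Design a Retro Style Ribbon</a></li>
    </ul>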

Make your site Accessible


Remember, search engines are meant to bring people to your site, which means your site is primarily for humans. Design with them in mind.
Include alt attributes for all images on your site. This is not only good practice but also a necessity if you want valid markup. If appropriate, include relevant keywords in the alt text. Remember, search bots can't really look at a picture and decide whether it's relevant; appropriate alt text lets them make that decision. As always, don't go overboard. Keep it simple and to the point.
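For example (the file name is hypothetical):

    <!-- short, honest, keyword-aware alt text -->
    <img src="melbourne-office.jpg" alt="Our web development office in Melbourne">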
Please don't hide your content behind obnoxious JavaScript or Flash. Spiders can't go through those to get to your content. And without content, the entire point of your site fails. Miserably. Avoid this unless you absolutely have to.

Avoid Duplicate Content


Google is very strict about duplicate content and severely penalizes sites that carry it, regardless of whether the copies sit on different domains. If the exact same content appears on different pages, the page indexed last is going to be penalized.
This is mostly common sense: don't have the same content on every page. Footer text can be repeated without penalty, but not if your footer is big enough to qualify as an article.
Also, your site may dish out alternative print-friendly pages, which the search engine might see as duplicate content. In that case, use robots.txt to disallow indexing of those pages (there's an example in the next section).

Use robots.txt


Create a robots.txt file to allow or disallow spiders from certain parts of your site. Just create a file named robots.txt, place it at the root of your web site, and all co-operating spiders will respect the rules you've listed in the file.
You can do everything from disallowing all bots from accessing a specific folder to disallowing bots from a specific search engine.
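A small sketch covering both cases; the /print/ folder and the bot name are made-up examples:

    # Keep all co-operating bots out of the print-friendly duplicates
    User-agent: *
    Disallow: /print/

    # Shut one specific crawler out entirely
    User-agent: BadBot
    Disallow: /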

Create a Site Map


A site map lets the search engine know about pages it might not have discovered by spidering through your site normally. Ideally, you should create a normal HTML site map for your users and an XML site map for the search bots. If at all possible, link both.
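An XML site map follows the sitemaps.org protocol; a minimal sketch listing a single (hypothetical) page:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.somecompany.com/blog/seobasics</loc>
        <lastmod>2009-09-17</lastmod>
      </url>
    </urlset>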

Avoid Frames


I can't say this enough: frames are bad, both from a web developer's perspective and from an SEO perspective. Content inside frames is virtually invisible to search engines.
More disturbingly, even if one frame of the page gets indexed and returned as a result, that result will take you to just that frame, without the supporting frames of the parent document around it. Frames cause undue confusion for people and virtually stop spiders from crawling through your site. Unless you absolutely have to, don't use frames.

Reduce Code Bloat


And by this I mean two things.
First, move your JavaScript and CSS into their own separate files. Spiders have no business with them, and it is best practice to remove them from the core markup; create separate files and include them instead.
Second, no presentational markup. This is not only SEO friendly but also best practice. Your HTML is no place to define how the content should look, and the bots have no reason to know how your site is styled. Format the document to your heart's content in your CSS and leave the markup pristine and clean.
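In practice that boils down to a head section like this (the file paths are made up):

    <!-- Styles and scripts live in their own files, out of the markup -->
    <link rel="stylesheet" type="text/css" href="/css/style.css">
    <script type="text/javascript" src="/js/script.js"></script>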

Avoid using a Flash Only Navigation


This is common sense, but a lot of designers and developers tend to overlook it. Bots can't crawl through Flash-based content, and if the only navigation is Flash-based, the bot has nothing to crawl.
If your entire site is Flash-based, it makes sense to create a text-only version for spiders and bots to crawl and find your content. It'll take extra time to create, but without a text version to fall back on, your site will be virtually invisible to search engines.

Use a Common Domain Naming Scheme


Decide on a common naming scheme and stick to it. Personally I prefer www.somename.com, but others may like http://somename.com. Whichever format you choose, use URLs of that format when linking to other pages on your site.
Also decide whether trailing slashes are required or not. Search engines consider www.somename.com/seo and www.somename.com/seo/ to be different URLs, so there is a possibility you'll be penalized for duplicate content. To get around this, modify your .htaccess file to redirect to the format you like with a 301 redirect, which tells the bot that the page has moved permanently.
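Assuming Apache, a hypothetical .htaccess sketch that enforces both decisions at once:

    RewriteEngine On

    # 301-redirect the bare domain to the www form
    RewriteCond %{HTTP_HOST} ^somename\.com$ [NC]
    RewriteRule ^(.*)$ http://www.somename.com/$1 [R=301,L]

    # Drop trailing slashes so /seo/ and /seo don't count as two pages
    RewriteCond %{REQUEST_FILENAME} !-d
    RewriteRule ^(.+)/$ /$1 [R=301,L]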

Submit your Site


If your site is newly hatched and hasn't been indexed yet, it's a good idea to get the ball rolling by submitting it to search engines and inspiration galleries. This not only lets the search engines get to your site early, but also brings in a ton of new traffic and back links.
Do not resort to link submitters unless you absolutely trust them. A lot of these submit your links to a number of link farms, an activity which might get you penalized. Just stick to the big search engines and galleries.

Check for Broken Links


Nothing stops spiders dead in their tracks quicker than broken links, especially on the home page. Check thoroughly for broken links to ensure the bots have something to crawl through on your site.
Create a proper 404 page in case the search engine leads a visitor to an old URL, and include appropriate links in the error page.
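On Apache, pointing visitors at a custom error page is a single directive (the path is hypothetical):

    ErrorDocument 404 /404.html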

Get Linked by Peer Sites


This is the massive step that is going to take a lot of time to get right. Ideally, you want a lot of sites linking to your site and your posts. Each link to your site is considered a vote for it by the linking site. Getting inbound links from sites catering to the same user base is extremely important, since current ranking methods assume that if a lot of sites link back to you, your site must contain relevant information.
Unfortunately, this is a long, arduous and never-ending task, and only one thing can assure it: good content. Provide good content and sites will start linking to you on their own. The more sites link to you, the higher your rank is going to be.
Do not resort to illegitimate means of getting back links; this includes link farms and the like, and doing anything of the sort is going to get you kicked out pretty quickly. Accepted means include reciprocal linking, where a site places a link to another site in exchange for that site linking back.
The way I prefer is to write for Net Tuts. Each article I write nets me a back link, and Net Tuts being as large as it is, these contribute heavily to my rankings. Plus it brings in a ton of interested new visitors. :)

Use Appropriate Tools


Tools like Google Analytics help you analyze and track all kinds of data, including where your traffic comes from, which pages visitors look at, how much time they spend on each page, how many pages they visit, and so on. Use this data to fine-tune your site.
Don't forget Google Webmaster Tools. It lets you see the search queries that bring visitors to your page, whether the spider encountered any errors while trying to crawl your site, which sites link to you, and more. Invaluable when you are trying to optimize.
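For reference, the ga.js embed Google hands out looks roughly like this; UA-XXXXXXX-X is a placeholder you'd swap for your own account ID:

    <script type="text/javascript">
    var gaJsHost = (("https:" == document.location.protocol) ? "https://ssl." : "http://www.");
    document.write(unescape("%3Cscript src='" + gaJsHost + "google-analytics.com/ga.js' type='text/javascript'%3E%3C/script%3E"));
    </script>
    <script type="text/javascript">
    try {
      var pageTracker = _gat._getTracker("UA-XXXXXXX-X");  // your account ID goes here
      pageTracker._trackPageview();                        // record the page view
    } catch(err) {}
    </script>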

Avoid Black Hat Techniques

I can't say this enough: don't try to cheat. Sooner or later, probably sooner than you think, you are going to be caught and kicked out with no chance of getting listed again. This covers everything from legit-sounding techniques like link farms and cross-linking to keyword stuffing and keyword dilution.
Just don't do it.

Wait for the Results

At this point, you've hopefully done everything right. The only thing left to do is sit back, keep generating quality content, and wait for the rankings to increase. Be patient; this doesn't happen overnight, but it definitely happens once you have the basics nailed down.
