This article shows you how to configure the Website Settings page.
The Website Settings page contains five data input fields that affect the design and operation of your site. Complete these fields as soon as you start building a website.
- Analytics
- Default Language
- Robots.txt
- Header Script
- Footer Scripts
Locating the settings page
- Click WEBSITE on the top menu bar. The Website Content page displays.
- In the left-hand menu, select the SETTINGS folder and then click General. The Settings page displays.
Saving changes
After making changes to the data fields, click the green Publish button at the top of the page.
1 Entering analytics code
Baidu Analytics is a web analytics and SEO tool with a Simplified Chinese interface. It can help you measure and improve your website’s performance. With a Baidu Analytics account, you can:
- integrate pay-per-click marketing
- identify important organic search keywords
- track individual visitor behaviour on your site
- create heat maps
- access Baidu Webmaster Tools
Getting the Baidu Analytics script
When you are ready to open a Baidu account, contact Sinorbis. We can obtain and insert the Baidu Analytics script for you.
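For reference, a standard Baidu Analytics tracking snippet looks roughly like the sketch below. The site ID in the hm.js address is a placeholder; the snippet issued for your account, which Sinorbis inserts for you, contains your own ID:
<script>
// Typical Baidu Analytics (Tongji) tracking snippet (sketch only).
// "YOUR_SITE_ID" is a placeholder for the ID issued with your account.
var _hmt = _hmt || [];
(function() {
  var hm = document.createElement("script");
  hm.src = "https://hm.baidu.com/hm.js?YOUR_SITE_ID";
  var s = document.getElementsByTagName("script")[0];
  s.parentNode.insertBefore(hm, s);
})();
</script>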
2 Selecting the default language
Select the default language for your website by choosing a language option from the menu, then click the Publish button at the top of the page. The options are:
- Simplified Chinese (default)
- English (Australia)
- English (UK)
- English (US)
- French
3 Modifying the robots.txt file
The information in a robots.txt file tells web crawlers which website pages and directories they can and cannot look at. A web crawler is an automatic program that finds and catalogues web pages as well as online files and documents.
Why use a robots.txt file?
Robots.txt files are useful because they tell search engines to:
- avoid duplicate content on your site (bad for SEO)
- ignore certain areas of your website (e.g. keep some material semi-private, like test pages)
- stay away from certain files on your site (e.g. images, PDFs, etc.)
- locate your sitemap (good for SEO)
Getting a robots.txt file
Free resources on the internet teach you how to create a robots.txt file. For example, Google this phrase:
- create a robots.txt file
Sample code
Here is an example of the code (also called a script) that can go in this field. This script tells web crawlers that they can look at everything on your website:
User-agent: *
Disallow:
Be careful when creating a script. A small change can have a big impact. The following script, for example, stops web crawlers from looking at any of your content:
User-agent: *
Disallow: /
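Scripts can also combine directives. The sketch below, for example, keeps crawlers out of a test directory and away from PDF files, and points them to your sitemap. The paths and domain are placeholders only, so adjust them to match your own site:
User-agent: *
# Keep crawlers out of a semi-private test area (placeholder path)
Disallow: /test-pages/
# Ask crawlers to skip PDF files (wildcard syntax, supported by major search engines)
Disallow: /*.pdf$
# Point crawlers to your sitemap (placeholder address)
Sitemap: https://www.example.com/sitemap.xml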
Pasting code
Once you have created a robots.txt file, paste the script into the robots.txt field.
4 Inserting a header script
You don’t need any coding experience to build a website on the Sinorbis platform. However, developers can insert custom code into the HTML head section to add advanced styling and tracking elements to the site.
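As a sketch of what this field can hold, the snippet below loads a custom stylesheet and adds a site-verification meta tag. The stylesheet address and verification code are placeholders only:
<!-- Example head elements (placeholder values) -->
<link rel="stylesheet" href="https://www.example.com/assets/custom-fonts.css">
<meta name="baidu-site-verification" content="YOUR_VERIFICATION_CODE">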
5 Inserting footer scripts
With an easy-to-use drag-and-drop interface, marketers do not need coding experience to build a website. However, advanced users can insert custom JavaScript code before the closing body tag to add advanced functionality to the site.
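As a sketch, the snippet below is the kind of footer script an advanced user might paste into this field. It adds a simple "Back to top" link to each page; the wording and behaviour are illustrative only:
<script>
// Illustrative footer script: adds a "Back to top" link to every page.
document.addEventListener("DOMContentLoaded", function () {
  var link = document.createElement("a");
  link.href = "#";
  link.textContent = "Back to top";
  link.addEventListener("click", function (event) {
    event.preventDefault();
    window.scrollTo({ top: 0, behavior: "smooth" });
  });
  document.body.appendChild(link);
});
</script>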
6 Understanding sitemaps
Sitemaps are an important part of website marketing. A sitemap tells search engines about your website’s URLs and content structure. The Sinorbis platform automatically generates a sitemap for you.