This article shows you how to configure the Settings page.
The Settings page contains several data input fields and toggle switches that affect the design and operation of your site. These settings should be configured as soon as you start building a website.
- Analytics
- Default Language
- Robots.txt
- Header Script
- Footer Scripts
- Lazy Loading of Images
- Dynamic Spacing
Locating the settings page
Click PUBLISH on the top menu bar. A popup window displays. Click Website Content > Settings > General. The Settings page displays.
Saving changes
Click the green Publish button at the top of the page to save your changes.
1 Entering analytics code
Baidu Analytics is an SEO tool with a Simplified Chinese interface. It can help improve your website’s performance. With a Baidu Analytics account, you can:
- integrate pay-per-click marketing
- identify important organic search words
- track individual behaviour on your site
- create heat maps
- access Baidu Webmaster Tools
Getting the Baidu analytics script
When you're ready to open a Baidu account, contact Sinorbis. We can obtain and insert the Baidu Analytics script for you.
2 Selecting the default language
To set the default language for your website, click the field and select an option on the drop-down menu.
3 Modifying the robots.txt file
The information in a robots.txt file tells web crawlers which website pages and directories they can and cannot look at. A web crawler is an automatic program that finds and catalogues web pages as well as online files and documents.
Why use a robots.txt file?
Robots.txt files are useful because they tell search engines to:
- avoid duplicate content on your site (bad for SEO)
- ignore certain areas of your website (e.g. keep some material semi-private, like test pages)
- stay away from certain files on your site (e.g. images, PDFs, etc.)
- locate your sitemap (good for SEO)
Getting a robots.txt file
Free resources on the internet teach you how to create a robots.txt file. For example, Google this phrase:
- create a robots.txt file
Sample code
Here is an example of the code (called a script) that can go in this field. This script tells web crawlers to look at everything on your website:
User-agent: *
Disallow:
Be careful when creating a script. A small change can have a big impact. This script, for example, stops web crawlers from looking at any of your content:
User-agent: *
Disallow: /
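You can also combine directives. The following script, for example, keeps web crawlers out of one directory while pointing them to your sitemap. The directory name and sitemap URL here are placeholders, so replace them with your own values:
User-agent: *
Disallow: /test-pages/
Sitemap: https://www.example.com/sitemap.xml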
Pasting code
Once you have created a robots.txt file, paste the script into the robots.txt field. Click Publish to save.
4 Inserting header script
You don’t need any coding experience to build a website on the Sinorbis platform, but developers can insert custom code into the HTML head section. This code adds advanced style and tracking elements to the site.
5 Inserting footer scripts
With an easy-to-use drag-and-drop interface, marketers do not need coding experience to build a website. Advanced users, however, can insert custom JavaScript code before the closing body tag. This code adds advanced functions to your site.
6 Allowing lazy loading of images
If your site has lots of images, this feature can improve page speed by loading images only when they are needed. Switch on the toggle to enable this feature.
7 Allowing dynamic spacing
Sinorbis auto-adjusts padding and margins to improve readability on tablets and mobile screens. Switch on the toggle to enable this feature.
8 Understanding sitemaps
Sitemaps are an important part of website marketing. A sitemap tells search engines about your website’s URLs and content structure. The Sinorbis platform automatically generates a sitemap for you.