Key Takeaways
1. Make Fewer HTTP Requests to Significantly Improve Front-End Performance
80-90% of end-user response time is spent downloading all the components in the page.
Front-end optimization is critical. While back-end performance matters, most of the response time goes to downloading page components such as images, stylesheets, and scripts. Reducing the number of these components with techniques like CSS sprites, image maps, and inline images can noticeably improve load times, and combining multiple JavaScript or CSS files into a single file cuts HTTP requests further. Focusing on minimizing the number of components yields the largest front-end performance gains; a small inline-image sketch follows the list below.
- Key techniques for reducing HTTP requests:
  - Combine multiple images with CSS sprites
  - Use image maps for navigation elements
  - Inline images with data: URLs
  - Combine JavaScript and CSS files
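As a minimal illustration of the inline-image technique, the Node/TypeScript sketch below base64-encodes a small icon into a data: URL so it ships inside the CSS instead of costing a separate HTTP request. The file path and class name are made up for the example.

```typescript
// Sketch: inline a small image as a data: URL (one fewer HTTP request).
import { readFileSync } from "fs";

function toDataUrl(filePath: string, mimeType: string): string {
  // Read the raw bytes and encode them as base64 for embedding.
  const base64 = readFileSync(filePath).toString("base64");
  return `data:${mimeType};base64,${base64}`;
}

// Hypothetical icon embedded directly in a style rule instead of a separate file.
const iconUrl = toDataUrl("assets/arrow.png", "image/png");
const css = `.next-link { background-image: url("${iconUrl}"); }`;
console.log(css);
```

Data: URLs are best reserved for small images, since the encoded bytes are not cached separately and inflate the containing document.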
2. Leverage Browser Caching with a Far Future Expires Header
A far future Expires header eliminates the need to check with the server by making it clear whether the browser can use its cached copy of a component.
Maximize browser caching. Setting a far future Expires header (one year or more) on static components lets browsers cache them for a long time, eliminating unnecessary HTTP requests on subsequent page views. This requires changing a component's filename whenever it is updated, but the performance gain is substantial. For dynamically generated content that changes frequently, a shorter expiration time is appropriate. Correct caching headers dramatically reduce server load and improve page load times for returning visitors; a small server sketch follows the list below.
- Benefits of a far future Expires header:
  - Eliminates HTTP requests for cached components
  - Reduces server load
  - Improves page load times for repeat visitors
- Caveats:
  - Filenames must change when components are updated
  - Use shorter expiration times for dynamic content
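The book discusses this as server configuration; as one possible rendering of the idea, here is a minimal Node/TypeScript static-file sketch that sends far future caching headers. It assumes assets live under ./public with a version in the filename (e.g. app.v2.js), and it omits Content-Type handling and path sanitization for brevity.

```typescript
// Sketch: serve static files with a far future Expires / Cache-Control header.
import { createServer } from "http";
import { readFile } from "fs";
import { join } from "path";

const ONE_YEAR_SECONDS = 365 * 24 * 60 * 60;

createServer((req, res) => {
  const filePath = join("public", req.url ?? "/"); // illustration only, no sanitization
  readFile(filePath, (err, data) => {
    if (err) {
      res.writeHead(404);
      res.end();
      return;
    }
    // The browser may reuse its cached copy for a year without asking again,
    // which is why updated components need new (versioned) filenames.
    res.writeHead(200, {
      "Expires": new Date(Date.now() + ONE_YEAR_SECONDS * 1000).toUTCString(),
      "Cache-Control": `public, max-age=${ONE_YEAR_SECONDS}`,
    });
    res.end(data);
  });
}).listen(8080);
```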
3. Gzip Components to Reduce Transfer Time
Gzipping generally reduces the response size by about 70%.
Enable gzip compression. Gzip can shrink HTTP responses dramatically, typically by about 70%, which cuts transfer time and especially benefits users on slow connections. Compressing and decompressing carries a small CPU cost, but the bandwidth savings far outweigh it. Configure your web server to gzip HTML documents, JavaScript, CSS, and other text responses, and avoid gzipping formats that are already compressed, such as images and PDFs. Properly implemented, gzip compression is one of the simplest and most effective performance optimizations; a server-side sketch follows the list below.
- Components to gzip:
  - HTML documents
  - JavaScript files
  - CSS files
  - XML and JSON responses
- Do not gzip:
  - Images
  - PDF files
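The book configures compression in Apache (mod_gzip / mod_deflate); the sketch below shows the same idea in Node/TypeScript, gzipping a text response only when the client advertises support. Already-compressed formats such as images and PDFs should be sent as-is.

```typescript
// Sketch: gzip an HTML response when the client sends Accept-Encoding: gzip.
import { createServer } from "http";
import { gzipSync } from "zlib";

createServer((req, res) => {
  const body = "<html><body><p>Hello, compressed world.</p></body></html>";
  const acceptEncoding = req.headers["accept-encoding"] ?? "";

  if (/\bgzip\b/.test(acceptEncoding)) {
    const compressed = gzipSync(body); // real pages typically shrink by roughly 70%
    res.writeHead(200, {
      "Content-Type": "text/html",
      "Content-Encoding": "gzip",
      "Content-Length": compressed.length,
    });
    res.end(compressed);
  } else {
    // Fall back to the uncompressed body for clients that don't accept gzip.
    res.writeHead(200, { "Content-Type": "text/html" });
    res.end(body);
  }
}).listen(8080);
```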
4. Put CSS at the Top and JavaScript at the Bottom of the HTML Document
Putting stylesheets near the bottom of the document prohibits progressive rendering in many browsers.
Optimize the placement of CSS and JavaScript. Placing stylesheets at the top of the HTML document lets the browser load them early and render the page progressively, giving users visual feedback sooner. Conversely, placing JavaScript files at the bottom of the document keeps them from blocking the download of other components. This approach balances fast visual rendering with efficient resource loading. Where possible, load non-essential scripts after the page's load event to further improve perceived performance; a small sketch of that follows the list below.
- CSS placement:
  - Put it in the <head> section
  - Use <link> tags, not @import
- JavaScript placement:
  - Put it just before the </body> tag
  - Load non-essential scripts after the page has loaded
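One common way to defer a non-essential script until after the load event is to inject it dynamically, as in the browser-side TypeScript sketch below. The analytics script URL is a placeholder.

```typescript
// Sketch: download a non-critical script only after the page has finished loading,
// so it never competes with the content the user is waiting to see.
window.addEventListener("load", () => {
  const script = document.createElement("script");
  script.src = "/js/analytics.js"; // hypothetical non-essential script
  script.async = true;
  document.body.appendChild(script);
});
```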
5. Minify and Combine JavaScript and CSS Files
On average, JSMin reduced the size of JavaScript files by 21%, while the Dojo compressor achieved a 25% reduction.
Reduce file size. Minifying JavaScript and CSS by removing comments, whitespace, and unnecessary characters can shrink files significantly, and tools such as JSMin automate the process. In addition, combining multiple JavaScript or CSS files into a single file reduces HTTP requests. This may slightly increase the size of the initial download, but it improves caching efficiency and reduces overhead on subsequent page views. For larger sites, implement a build process that handles minification and combination automatically; a toy build-step sketch follows the list below.
- Minification techniques:
  - Remove comments and whitespace
  - Shorten variable and function names
  - Use shorter syntax alternatives
- Combination strategies:
  - Group files by page or by feature
  - Use server-side includes or build tools
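As a rough illustration of such a build step, the Node/TypeScript sketch below concatenates several scripts into one bundle so a page needs a single request instead of three. The file paths are hypothetical, and the whitespace trimming is only a toy stand-in for a real minifier such as JSMin.

```typescript
// Build-step sketch: combine scripts and do a crude size reduction.
import { readFileSync, writeFileSync } from "fs";

const sources = ["js/menu.js", "js/forms.js", "js/carousel.js"]; // hypothetical paths

const bundle = sources
  .map((file) => readFileSync(file, "utf8"))
  .join(";\n"); // separating semicolon guards against files that omit a trailing one

// Toy "minification": drop blank lines and trailing whitespace only.
// A real build would run JSMin or an equivalent tool here.
const trimmed = bundle
  .split("\n")
  .map((line) => line.trimEnd())
  .filter((line) => line.length > 0)
  .join("\n");

writeFileSync("dist/bundle.js", trimmed);
console.log(`bundle: ${bundle.length} -> ${trimmed.length} bytes`);
```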
6. Use a Content Delivery Network (CDN) to Reduce Latency
If the component web servers are closer to the user, the response times of many HTTP requests will improve.
Distribute static content through a CDN. A content delivery network spreads your static content across multiple geographically dispersed servers, reducing latency by serving content from a server closer to the user's physical location. CDNs were traditionally used by large companies, but many affordable options now suit smaller sites as well. They are especially effective for sites with a geographically dispersed user base: they improve response times, help absorb traffic spikes, and reduce the load on your main web servers. A small URL-rewriting sketch follows the list below.
- Benefits of using a CDN:
  - Lower latency for geographically dispersed users
  - Better ability to handle traffic spikes
  - Reduced load on origin servers
- Content to consider serving from a CDN:
  - Images
  - JavaScript files
  - CSS files
  - Large downloadable files
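In practice, adopting a CDN for static components often amounts to rewriting asset URLs to point at the CDN hostname while the HTML document continues to come from the origin. A tiny TypeScript helper along those lines is sketched below; the hostname and paths are placeholders.

```typescript
// Sketch: map local asset paths onto a CDN hostname.
const CDN_HOST = "https://static.example-cdn.com"; // placeholder CDN hostname

function cdnUrl(assetPath: string): string {
  // Strip any leading slashes, then prefix with the CDN host.
  return `${CDN_HOST}/${assetPath.replace(/^\/+/, "")}`;
}

// Templates reference cdnUrl("img/logo.png") instead of a local path, so
// images, scripts, and stylesheets are served from edge locations.
console.log(cdnUrl("/css/site.css")); // https://static.example-cdn.com/css/site.css
```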
7. Optimize Ajax Requests to Improve Web 2.0 Application Performance
Using Ajax is no guarantee that the user won't be left waiting for those "asynchronous JavaScript and XML" responses.
Make Ajax responses cacheable. Ajax can improve the user experience in Web 2.0 applications, but it is not inherently fast. Optimize Ajax responses by making them cacheable with the appropriate headers, and use GET requests for Ajax calls that do not modify data so the browser can cache the responses. Minimize Ajax response size by using an efficient data format such as JSON. For predictable user actions, consider prefetching data with Ajax to improve perceived performance, and always give users feedback during long-running Ajax requests to maintain a sense of responsiveness. A small fetch sketch follows the list below.
- Ajax optimization techniques:
  - Use GET for cacheable requests
  - Minimize response size (use JSON)
  - Prefetch predictable data
  - Provide user feedback for long-running requests
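The browser-side TypeScript sketch below shows one way to make an Ajax read cacheable: it uses GET (so the response can be cached at all) and embeds a "last updated" token in the URL so a cached copy keeps being reused until the underlying data changes. The endpoint and parameters are made up; the server is assumed to send appropriate caching headers.

```typescript
// Sketch: a cacheable Ajax GET that returns compact JSON.
async function loadProfile(userId: string, lastUpdated: number): Promise<unknown> {
  // The version token changes only when the data changes, so the browser can
  // serve repeat requests from its cache.
  const url = `/api/profile?user=${encodeURIComponent(userId)}&v=${lastUpdated}`;
  const response = await fetch(url, { method: "GET" });
  if (!response.ok) {
    throw new Error(`profile request failed: ${response.status}`);
  }
  return response.json(); // JSON keeps the payload small compared with XML
}
```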
8. Configure ETags Correctly to Improve Caching Efficiency
ETags generated by Apache and IIS for the same component won't match from one server to another.
Optimize or remove ETags. Entity tags (ETags) are used to validate cached components, but their default implementations in Apache and IIS actually hinder caching in multi-server environments: the default ETag includes server-specific information, which forces unnecessary revalidation when requests hit different servers. For multi-server setups, either configure ETags to exclude that information or remove them entirely and rely on the Last-Modified header instead. Correctly configured ETags can improve caching efficiency, but misconfigured ones degrade performance. A sketch of Last-Modified-only validation follows the list below.
- Options for handling ETags:
  - Remove ETags entirely
  - Configure them to exclude server-specific information
  - Use them only for resources that change frequently
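On Apache the fix is a configuration change; as a language-neutral illustration of the "Last-Modified instead of ETags" approach, here is a minimal Node/TypeScript sketch. It sends no ETag at all and answers conditional requests with If-Modified-Since, so validation behaves the same no matter which server in a cluster responds. The asset path is hypothetical.

```typescript
// Sketch: cache validation with Last-Modified only, no ETag header.
import { createServer } from "http";
import { statSync, readFileSync } from "fs";

createServer((req, res) => {
  const filePath = "public/logo.png"; // hypothetical static asset
  const lastModified = statSync(filePath).mtime;
  lastModified.setMilliseconds(0); // HTTP dates have one-second resolution

  const since = req.headers["if-modified-since"];
  if (since && new Date(since).getTime() >= lastModified.getTime()) {
    res.writeHead(304); // still fresh: no body, and no server-specific ETag involved
    res.end();
    return;
  }

  res.writeHead(200, {
    "Content-Type": "image/png",
    "Last-Modified": lastModified.toUTCString(),
  });
  res.end(readFileSync(filePath));
}).listen(8080);
```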
9. Avoid Redirects Wherever Possible to Reduce Round Trips
Redirects are an easy way to solve many problems, but it's better to use alternative solutions that don't slow down page loading.
Eliminate unnecessary redirects. Every redirect adds an extra HTTP request-response cycle and therefore delays page loading. Common uses of redirects, such as connecting parts of a site or tracking outbound links, can often be replaced with more efficient alternatives. For example, use server-side includes or URL rewriting instead of redirects to connect different sections of a site, and consider beacon images or the XMLHttpRequest object instead of redirects for tracking outbound links. When a redirect is unavoidable, use a permanent (301) redirect so browsers can cache the new location. A beacon-image sketch follows the list below.
- Alternatives to redirects:
  - Server-side includes
  - URL rewriting
  - Alias or mod_rewrite in Apache
  - Beacon images for tracking
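The browser-side TypeScript sketch below illustrates the beacon-image alternative to redirect-based click tracking: the click fires a tiny image request to a logging endpoint instead of bouncing the user through an intermediate URL. The /beacon endpoint and the data-outbound attribute are assumptions for the example, and the beacon may not complete if the navigation is immediate.

```typescript
// Sketch: log an outbound click with a beacon image instead of a redirect.
function trackOutboundClick(targetUrl: string): void {
  const beacon = new Image();
  // The request itself is the log entry; the response is ignored.
  beacon.src = `/beacon?dest=${encodeURIComponent(targetUrl)}&t=${Date.now()}`;
}

document.querySelectorAll<HTMLAnchorElement>("a[data-outbound]").forEach((link) => {
  link.addEventListener("click", () => trackOutboundClick(link.href));
});
```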
10. Reduce DNS Lookups to Minimize Connection Delays
DNS lookups add latency before the browser can even open a connection, so reducing the number of unique hostnames in a page reduces the number of DNS lookups.
Optimize DNS usage. DNS lookups add latency to HTTP requests, especially on mobile networks. Reduce the number of unique hostnames used in a page to minimize DNS lookups, but balance this against the need for parallel downloads: using 2-4 hostnames increases parallelization without excessive DNS overhead. Use Keep-Alive connections to reduce how often DNS lookups are needed for repeated requests to the same hostname, and consider a content delivery network (CDN) to reduce DNS lookup times for geographically dispersed users. A DNS-prefetch sketch follows the list below.
- Strategies for reducing the impact of DNS:
  - Limit unique hostnames to 2-4 per page
  - Use Keep-Alive connections
  - Consider using a CDN
  - Investigate DNS prefetching for critical domains
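One lightweight way to investigate DNS prefetching is to add dns-prefetch hints for the few asset hostnames a page actually uses, as in the browser-side TypeScript sketch below. The hostnames are placeholders; keep the list to the 2-4 domains the page really references.

```typescript
// Sketch: hint the browser to resolve asset hostnames early via dns-prefetch links.
const assetHosts = ["https://static1.example.com", "https://static2.example.com"];

for (const host of assetHosts) {
  const link = document.createElement("link");
  link.rel = "dns-prefetch"; // resolve the hostname before any asset is requested
  link.href = host;
  document.head.appendChild(link);
}
```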
FAQ
1. What’s "High Performance Web Sites" by Steve Souders about?
- Focus on Frontend Performance: The book explains that 80–90% of web page load time is spent on the frontend, not the backend, and provides actionable rules to optimize this.
- 14 Performance Rules: Souders presents 14 prioritized rules for making web sites faster, each explained in its own chapter with real-world examples.
- Practical, Data-Driven Advice: The book is based on research and extensive testing, offering practical techniques that can be implemented with minimal effort for significant speed gains.
- Covers a Range of Technologies: Topics include HTTP, caching, compression, JavaScript, CSS, CDNs, DNS, and Ajax, making it a comprehensive guide for web developers.
- Case Studies and Tools: The book analyzes top websites and introduces tools like YSlow to help developers measure and improve performance.
2. Why should I read "High Performance Web Sites" by Steve Souders?
- Immediate Impact on Speed: Implementing even a few of the book’s rules can make web pages noticeably faster, improving user experience and retention.
- Frontend Optimization Focus: It shifts the common misconception that backend is the main bottleneck, showing that frontend changes often yield bigger, faster wins.
- Actionable and Accessible: The rules are clear, concise, and often require only configuration changes or minor code adjustments, making them accessible to most developers.
- Industry Endorsements: The book is praised by leading developers and is considered essential reading for web developers and performance engineers.
- Long-Term Benefits: Most optimizations are one-time tweaks that continue to pay off as your site grows and evolves.
3. What are the key takeaways from "High Performance Web Sites"?
- Frontend Dominates Load Time: The majority of user wait time is due to frontend issues, not backend processing.
- 14 Rules for Speed: Following the book’s prioritized rules can reduce page load times by 25–50% or more.
- Caching and Fewer Requests: Techniques like reducing HTTP requests, leveraging browser caching, and combining files are among the most effective.
- Measure and Profile: Always profile your site to identify where the biggest gains can be made, using tools like YSlow and Firebug.
- Continuous Improvement: Performance is an ongoing process; regularly review and update your practices as your site and technology evolve.
4. What is the "Performance Golden Rule" in "High Performance Web Sites" by Steve Souders?
- Backend vs. Frontend Time: Only 10–20% of end user response time is spent downloading the HTML document; the remaining 80–90% is spent on frontend components.
- Optimization Focus: To achieve the greatest performance improvements, focus on optimizing the frontend—images, scripts, stylesheets, and other resources.
- Diminishing Returns on Backend: Halving backend response time yields only minor overall gains, while halving frontend time can cut total load time by up to 45%.
- Easier and Faster to Implement: Frontend optimizations typically require less time and fewer resources than backend changes.
- Proven Results: Many teams at Yahoo! and elsewhere have achieved 25% or greater reductions in response times by following this rule.
5. What are the 14 rules for high performance web sites according to Steve Souders?
- Rule 1: Make Fewer HTTP Requests: Reduce the number of components (images, scripts, stylesheets) to minimize HTTP overhead.
- Rule 2: Use a Content Delivery Network (CDN): Serve static content from geographically distributed servers to reduce latency.
- Rule 3: Add an Expires Header: Leverage browser caching by setting far-future Expires or Cache-Control headers on static resources.
- Rule 4: Gzip Components: Compress HTML, CSS, and JavaScript to reduce transfer size and speed up downloads.
- Rule 5: Put Stylesheets at the Top: Place CSS in the document HEAD to enable progressive rendering and avoid blank screens.
- Rule 6: Put Scripts at the Bottom: Move JavaScript to the end of the page to prevent blocking rendering and parallel downloads.
- Rule 7: Avoid CSS Expressions: Don’t use dynamic CSS expressions in IE, as they can cause severe performance issues.
- Rule 8: Make JavaScript and CSS External: Use external files for better caching and reuse, except in rare cases like homepages.
- Rule 9: Reduce DNS Lookups: Limit the number of unique hostnames to minimize DNS resolution delays.
- Rule 10: Minify JavaScript: Remove unnecessary characters from code to reduce file size and improve load times.
- Rule 11: Avoid Redirects: Eliminate unnecessary HTTP redirects, which delay page loading.
- Rule 12: Remove Duplicate Scripts: Ensure scripts are included only once to avoid extra downloads and execution.
- Rule 13: Configure ETags: Remove or standardize ETags to prevent cache misses in multi-server environments.
- Rule 14: Make Ajax Cacheable: Apply caching and other performance rules to Ajax requests for dynamic applications.
6. How does "High Performance Web Sites" by Steve Souders recommend reducing HTTP requests, and why is this important?
- Biggest Impact on Load Time: Reducing HTTP requests is the single most effective way to speed up initial page loads, especially for first-time visitors.
- Techniques Provided: Use image maps, CSS sprites, inline images (with data: URLs), and combine scripts and stylesheets to cut down the number of requests.
- Balance Design and Performance: These methods allow you to maintain rich designs without sacrificing speed.
- Development Workflow: Modularize code during development, but use a build process to combine files for production.
- Real-World Results: Examples in the book show up to 50% faster load times by applying these techniques.
7. What is the role of caching and the Expires header in "High Performance Web Sites"?
- Browser Caching: Setting a far-future Expires or Cache-Control header allows browsers to reuse cached resources, reducing repeat HTTP requests.
- Primed vs. Empty Cache: Most users and page views benefit from caching, as a significant percentage revisit sites with a primed cache.
- Beyond Images: Apply caching headers to all static resources—images, scripts, stylesheets, and Flash—not just images.
- Filename Versioning: When updating resources, change filenames (e.g., add version numbers) to ensure users get the latest versions.
- Dramatic Speed Gains: Proper caching can cut response times for repeat visits by 50% or more.
8. How does "High Performance Web Sites" by Steve Souders address compression and file size reduction?
- Gzip Compression: Enable gzip (or deflate) on your server for HTML, CSS, and JavaScript to reduce file sizes by about 70%.
- What to Compress: Only compress text-based files; images and PDFs are already compressed and don’t benefit.
- Configuration Tips: The book provides Apache configuration examples for mod_gzip and mod_deflate, and discusses handling edge cases with proxies and browser bugs.
- Minification: Remove comments and whitespace from JavaScript (and CSS) using tools like JSMin for further size reduction.
- Combined Effect: Gzipping and minifying together maximize bandwidth savings and speed.
9. What are the best practices for JavaScript and CSS placement and management in "High Performance Web Sites"?
- External Files Preferred: Use external JavaScript and CSS files for better caching and reuse, except in special cases like homepages.
- Placement Matters: Put stylesheets in the HEAD for progressive rendering; place scripts at the bottom to avoid blocking downloads and rendering.
- Combining Files: Merge multiple scripts and stylesheets into single files to minimize HTTP requests.
- Minify and Gzip: Always minify and gzip these files to reduce size and improve load times.
- Special Techniques: For homepages, consider inlining or dynamic inlining with post-onload downloads to balance speed and caching.
10. How does "High Performance Web Sites" by Steve Souders recommend handling DNS lookups and redirects?
- Reduce Unique Hostnames: Limit the number of domains used for resources to minimize DNS resolution delays (each lookup can take 20–120ms).
- Balance Parallel Downloads: Use 2–4 hostnames to allow parallel downloads without excessive DNS lookups.
- Keep-Alive Connections: Enable persistent connections to further reduce DNS lookups and connection overhead.
- Avoid Redirects: Eliminate unnecessary HTTP redirects, as they delay the delivery of the HTML document and all resources.
- Alternatives to Redirects: Use server configuration (Alias, mod_rewrite) and referer logging instead of redirects for tracking and URL management.
11. What are ETags, and why does "High Performance Web Sites" by Steve Souders advise configuring or removing them?
- ETag Definition: ETags are unique identifiers for resource versions, used by browsers to validate cached content.
- Multi-Server Issues: Default ETag formats in Apache and IIS include server-specific data, causing cache misses when requests are served by different servers.
- Performance Impact: Mismatched ETags force unnecessary downloads, increasing load times and bandwidth usage.
- Recommended Solution: Remove ETags or configure them to be consistent across servers, relying on Last-Modified headers for cache validation.
- Real-World Prevalence: Many top sites have not addressed this, leading to avoidable performance penalties.
12. How does "High Performance Web Sites" by Steve Souders apply its rules to Ajax and modern web applications?
- Ajax Performance Matters: The same frontend rules apply to Ajax responses; using Ajax does not by itself make an application fast, so responses should be kept small, served via GET where possible, and made cacheable.
Reviews
"High Performance Web Sites" has received mixed but generally positive reviews. Many readers find the book informative and essential for web developers, praising its practical advice for improving site performance. Some highlight its continued relevance despite its age, while others consider the content dated. The book is appreciated for explaining the reasoning behind its optimization techniques and providing concrete examples. Critics note that some of the information is now readily available online or through tools like YSlow. Overall, it is regarded as a valuable resource for understanding the fundamentals of web performance.