Commit d89f7c5 (1 parent: 2be939c)

HIVE-29102: Replace deprecated cwiki links and point to the Website (#6031)

15 files changed: +24 additions, -24 deletions


README.md (5 additions, 5 deletions)

@@ -63,17 +63,17 @@ Getting Started
 ===============

 - Installation Instructions and a quick tutorial:
-  https://cwiki.apache.org/confluence/display/Hive/GettingStarted
-  https://hive.apache.org/development/quickstart/
+  https://hive.apache.org/development/gettingstarted-latest
+  https://hive.apache.org/development/quickstart

 - Instructions to build Hive from source:
-  https://cwiki.apache.org/confluence/display/Hive/GettingStarted#GettingStarted-BuildingHivefromSource
+  https://hive.apache.org/development/gettingstarted-latest/#building-hive-from-source

 - A longer tutorial that covers more features of HiveQL:
-  https://cwiki.apache.org/confluence/display/Hive/Tutorial
+  https://hive.apache.org/docs/latest/user/tutorial

 - The HiveQL Language Manual:
-  https://cwiki.apache.org/confluence/display/Hive/LanguageManual
+  https://hive.apache.org/docs/latest/language/languagemanual

 Requirements

dev-support/hive-personality.sh (1 addition, 1 deletion)

@@ -28,7 +28,7 @@ function personality_globals
   #shellcheck disable=SC2034
   PATCH_BRANCH_DEFAULT=master
   #shellcheck disable=SC2034
-  PATCH_NAMING_RULE="http://cwiki.apache.org/confluence/display/Hive/HowToContribute"
+  PATCH_NAMING_RULE="https://hive.apache.org/community/resources/howtocontribute"
   #shellcheck disable=SC2034
   JIRA_ISSUE_RE='^HIVE-[0-9]+$'
   #shellcheck disable=SC2034
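The `JIRA_ISSUE_RE` pattern kept in this hunk is what ties patches back to JIRA issue keys such as the one in this commit's subject. A minimal sketch of how the pattern behaves (the `matches_issue` helper is hypothetical, for illustration only; the regex itself is copied from hive-personality.sh):

```shell
# JIRA_ISSUE_RE is taken verbatim from hive-personality.sh.
JIRA_ISSUE_RE='^HIVE-[0-9]+$'

# Hypothetical helper: succeed (exit 0) iff the argument is a bare issue key.
matches_issue() {
  # grep -E: extended regex, -q: quiet, report via exit status only
  printf '%s\n' "$1" | grep -Eq "$JIRA_ISSUE_RE"
}
```

Because the pattern is anchored at both ends, `matches_issue HIVE-29102` succeeds while a full subject line like `HIVE-29102: Replace deprecated cwiki links` does not match.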

druid-handler/README.md (1 addition, 1 deletion)

@@ -18,4 +18,4 @@ limitations under the License.
 -->
 # Druid Storage Handler

-[Link for documentation]( https://cwiki.apache.org/confluence/display/Hive/Druid+Integration)
+[Link for documentation](https://hive.apache.org/docs/latest/user/druid-integration)

hbase-handler/README.md (1 addition, 1 deletion)

@@ -18,4 +18,4 @@ limitations under the License.
 -->
 # Hbase Storage Handler

-[Link for documentation]( https://cwiki.apache.org/confluence/display/Hive/HBaseIntegration)
+[Link for documentation](https://hive.apache.org/docs/latest/user/hbaseintegration)

hcatalog/README.txt (1 addition, 1 deletion)

@@ -33,4 +33,4 @@ For the latest information about HCatalog, please visit our website at:

 and our wiki, at:

-  https://cwiki.apache.org/confluence/display/HCATALOG
+  https://hive.apache.org/docs/latest/hcatalog/hcatalog-base

hcatalog/core/src/main/java/org/apache/hive/hcatalog/data/schema/HCatFieldSchema.java (1 addition, 1 deletion)

@@ -264,7 +264,7 @@ private void setName(String name) {
   public HCatFieldSchema(String fieldName, Type type, Type mapKeyType, HCatSchema mapValueSchema, String comment) throws HCatException {
     assertTypeInCategory(type, Category.MAP, fieldName);
     //Hive only supports primitive map keys:
-    //https://cwiki.apache.org/confluence/display/Hive/LanguageManual+Types#LanguageManualTypes-ComplexTypes
+    // https://hive.apache.org/docs/latest/language/languagemanual-types/#complex-types
     assertTypeInCategory(mapKeyType, Category.PRIMITIVE, fieldName);
     this.fieldName = fieldName;
     this.type = Type.MAP;

hcatalog/src/test/e2e/hcatalog/tests/hive_nightly.conf (1 addition, 1 deletion)

@@ -1000,7 +1000,7 @@ $cfg = {
 # Need to test multiple insert - Need harness enhancements
 # Need to test insert into directory - Need harness enhancements
 # Need to test casts
-# Need to test all built in expressions and UDF (see https://cwiki.apache.org/confluence/display/Hive/LanguageManual+UDF)
+# Need to test all built in expressions and UDF (see https://hive.apache.org/docs/latest/language/languagemanual-udf)
 # Need to test xpath functionality
 # Need to test regular expression based projection
 # Need to test semi joins - Mysql doesn't support, how do I express semi-join?

hcatalog/src/test/e2e/templeton/README.txt (6 additions, 6 deletions)

@@ -21,9 +21,9 @@ End to end tests in templeton runs tests against an existing templeton server.
 It runs hcat, mapreduce, streaming, hive and pig tests.
 This requires Hadoop cluster and Hive metastore running.

-It's a good idea to look at current versions of
-https://cwiki.apache.org/confluence/display/Hive/WebHCat+InstallWebHCat and
-https://cwiki.apache.org/confluence/display/Hive/WebHCat+Configure
+It's a good idea to look at current versions of:
+https://hive.apache.org/docs/latest/webhcat/webhcat-installwebhcat
+https://hive.apache.org/docs/latest/webhcat/webhcat-configure

 See deployers/README.txt for help automating some of the steps in this document.

@@ -98,13 +98,13 @@ Tips:
   be obtained from Pig and the other two are obtained from your Hadoop distribution.
   For Hadoop 1.x you would need to upload hadoop-examples.jar twice to HDFS one as hclient.jar and other as hexamples.jar.
   For Hadoop 2.x you would need to upload hadoop-mapreduce-client-jobclient.jar to HDFS as hclient.jar and hadoop-mapreduce-examples.jar to HDFS as hexamples.jar.
-  Also see https://cwiki.apache.org/confluence/display/Hive/WebHCat+InstallWebHCat#WebHCatInstallWebHCat-HadoopDistributedCache
+  Also see https://hive.apache.org/docs/latest/webhcat/webhcat-installwebhcat/#hadoop-distributed-cache
   for notes on additional JAR files to copy to HDFS.

 5. Make sure TEMPLETON_HOME environment variable is set

 6. hadoop/conf/core-site.xml should have items described in
-https://cwiki.apache.org/confluence/display/Hive/WebHCat+InstallWebHCat#WebHCatInstallWebHCat-Permissions
+https://hive.apache.org/docs/latest/webhcat/webhcat-installwebhcat/#permissions

 7. Currently Pig tar file available on http://pig.apache.org/ contains jar files compiled to work with Hadoop 1.x.
   To run WebHCat tests on Hadoop 2.x you need to build your own Pig tar for Hadoop 2. To do that download the

@@ -173,7 +173,7 @@ and webhcat.proxyuser.hue.hosts defined, i.e. 'hue' should be allowed to imperso
 [Of course, 'hcat' proxyuser should be configured in core-site.xml for the command to succeed.]

 Furthermore, metastore side file based security should be enabled.
-(See https://cwiki.apache.org/confluence/display/Hive/LanguageManual+Authorization#LanguageManualAuthorization-MetastoreServerSecurity for more info)
+(See https://hive.apache.org/docs/latest/language/languagemanual-authorization/#hive-authorization-options for more info)

 To do this 3 properties in hive-site.xml should be configured:
 1) hive.security.metastore.authorization.manager set to
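The three metastore-security properties the README alludes to can be sketched as a hive-site.xml fragment. The property names are standard Hive configuration keys, but the exact values below are assumptions for illustration, not part of this commit:

```xml
<!-- Sketch (assumed values): metastore-side storage-based authorization,
     per the Hive authorization manual linked above. -->
<property>
  <name>hive.security.metastore.authorization.manager</name>
  <value>org.apache.hadoop.hive.ql.security.authorization.StorageBasedAuthorizationProvider</value>
</property>
<property>
  <name>hive.security.metastore.authenticator.manager</name>
  <value>org.apache.hadoop.hive.ql.security.HadoopDefaultMetastoreAuthenticator</value>
</property>
<property>
  <name>hive.metastore.pre.event.listeners</name>
  <value>org.apache.hadoop.hive.ql.security.authorization.AuthorizationPreEventListener</value>
</property>
```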

hcatalog/src/test/e2e/templeton/deployers/config/hive/hive-site.xml (1 addition, 1 deletion)

@@ -54,7 +54,7 @@
   <!--
   enable file based auth for Hive on metastore side, i.e. enforce metadata
   security as if it were stored together with data
-  https://cwiki.apache.org/confluence/display/Hive/LanguageManual+Authorization
+  https://hive.apache.org/docs/latest/language/languagemanual-authorization
   <property>
     <name>hive.metastore.execute.setugi</name>
     <value>true</value>

hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/Server.java (1 addition, 1 deletion)

@@ -837,7 +837,7 @@ public EnqueueBean sqoop(@FormParam("command") String command,
    * @param srcFile name of hive script file to run, equivalent to "-f" from hive
    *                command line
    * @param hiveArgs additional command line argument passed to the hive command line.
-   *                 Please check https://cwiki.apache.org/Hive/languagemanual-cli.html
+   *                 Please check https://hive.apache.org/docs/latest/language/languagemanual-cli
    *                 for detailed explanation of command line arguments
    * @param otherFiles additional files to be shipped to the launcher, such as the jars
    *                   used in "add jar" statement in hive script
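A sweep like the one this commit performs can be approximated with a plain recursive grep over a checkout. The `find_cwiki_links` helper below is hypothetical (how the commit was actually prepared is not stated), and the pattern matches any reference to the deprecated wiki host:

```shell
# Hypothetical helper: list remaining references to the deprecated cwiki.
# -r: recurse, -n: show line numbers, -I: skip binary files.
find_cwiki_links() {
  grep -rnI 'cwiki\.apache\.org' "${1:-.}"
}
```

Run as `find_cwiki_links path/to/hive`; a non-zero exit status (no matches) suggests the tree is clean of deprecated links.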
