Matching procedurally generated worlds with one another


18

Have you read Roger Zelazny's The Chronicles of Amber?

Imagine yourself playing a third-person MMO. You spawn into the world and start wandering around. After a while, just when you think you have learned the map, you realize you are in a place you have never seen before. You go back to the last place you are certain you knew, and it is still there. But the rest of the world has changed, and you never even noticed it happening.

I have read about procedural worlds. I have read about Perlin noise and octaves, simplex noise, the diamond-square algorithm, tectonic plates and water erosion simulation. I believe I have some vague understanding of the general approaches to procedural generation.

With that knowledge, I still have no idea how to do what I described above. Every idea I come up with runs into some theoretical problem. Here are some of the ideas I can think of:

1) "Reversible" generation, which takes a seed number as input and completely describes a chunk for each chunk number

I doubt this is even possible, but imagine a function that takes a seed and produces a matrix of numbers, with chunks built on top of that matrix: for every unique number there is a unique chunk. A second function then takes such a unique chunk number and produces a seed that contains it. I tried to lay out the scheme in the diagram below:

[image: diagram of the proposed scheme]

2) Make the chunks completely random, with transitions between them.

As Aracthor suggested. The advantage of this approach is that it is actually possible and does not require any magic functions :)

The drawback, I think, is that it makes a diverse world impossible. If you had, say, an archipelago and a continent each represented by just one number, and they were neighboring chunks, the chunk sizes would not be equal. And I doubt it is possible to make nice transitions between chunks. Am I missing something?
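For reference, a minimal sketch of how "completely random" chunks are usually keyed in practice: derive a reproducible per-chunk seed from a world seed and the chunk coordinates, so each chunk is random yet regenerates identically whenever it is revisited. The function and mixing constants below are illustrative only, not part of any particular engine:

function chunkSeed(worldSeed, chunkX, chunkY) {
  // Mix the chunk coordinates into the world seed. The constants are
  // arbitrary large primes; any decent integer hash would do here.
  var h = (worldSeed ^ Math.imul(chunkX, 73856093) ^ Math.imul(chunkY, 19349663)) | 0;
  h = Math.imul(h ^ (h >>> 13), 0x5bd1e995);
  return (h ^ (h >>> 15)) >>> 0; // unsigned 32-bit seed, stable per chunk
}

The chunk's own generator (Perlin, diamond-square, whatever) would then be seeded with chunkSeed(worldSeed, cx, cy) instead of with an independent random number.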

So, in other words: you are developing an MMO with a procedurally generated world, but instead of having just one world you have many. Which approach would you take to generate the worlds, and how would you implement the player's transition from one world to another without the player noticing it?

In any case, I believe you get the general idea. How would you do it?


So there are some issues with my answering here. @Aracthor I have spoken with you before about smooth manifolds, which apply to this situation. However, there are 2 fairly well-voted answers already, so I wonder whether there is any point...
Alec Teal

@AlecTeal if you have anything to add, please do. I would be glad to hear any ideas and suggestions.
netaholic

Answers:


23

Use a slice of higher-dimensional noise. If you have been using 2D noise for your height maps, switch to 3D noise with the last coordinate held fixed. You can now slowly change the position along that last dimension to modify the terrain. Since Perlin noise is continuous in all dimensions, you get smooth transitions as long as you change the position at which you sample the noise function smoothly.

For example, you could change only the terrain far away from the player, using the distance as the offset. You could also store the offset for each coordinate of the map and only ever increase it, never decrease it. That way, the map only updates away from the player.

If you are already using 3D noise, just sample from 4D instead; the same idea applies. Also, take a look at simplex noise. It is an improved version of Perlin noise that works better in higher dimensions.
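A minimal sketch of this idea, assuming some continuous 3D noise function noise3(x, y, z) is available (for example from a simplex noise library); the scale constants and the 200-unit radius are made up for illustration:

function heightAt(x, y, offset) {
  // Sample a 2D slice of 3D noise; `offset` selects the slice along the
  // third axis. Moving the offset smoothly morphs the whole height map smoothly.
  return noise3(x * 0.01, y * 0.01, offset); // noise3 is assumed, not defined here
}

function heightNearPlayer(x, y, playerX, playerY) {
  // Let the slice drift only far away from the player, so nearby terrain
  // looks stable while distant terrain slowly becomes "another world".
  var dx = x - playerX, dy = y - playerY;
  var dist = Math.sqrt(dx * dx + dy * dy);
  var offset = Math.max(0, (dist - 200) * 0.002); // frozen within 200 units
  return heightAt(x, y, offset);
}

As the comments below add, in practice you would also remember the largest offset ever used for each map coordinate and never let it shrink, so already-visited terrain does not morph back while the player is looking at it.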


2
This is interesting. Do I understand correctly that you suggest generating 3D noise, using an xy slice at some z as the height map, and smoothly transitioning to another slice by changing the z coordinate (increasing it with distance from the player)?
netaholic

@netaholic Exactly. Describing it as a slice is very good intuition. In addition, you could track the largest value of the last coordinate for every position of the map and only ever increase it, never decrease it.
danijar

1
This is a brilliant idea. Basically, your terrain map is a parabolic (or otherwise curved) slice through a 3D volume.
Fake Name

That is a really clever idea.
user253751

5

Your idea of dividing the world into chunks is good. It is just incomplete.

The only problem is the junction between chunks. For example, if you generate the relief with Perlin noise and each chunk uses a different seed, you risk ending up with this:

[image: mismatched relief at a chunk border]

One solution is to generate a chunk's relief not only from its own Perlin noise seed, but also from those of the chunks around it.

The Perlin algorithm uses the surrounding random map values to "smooth" itself. If neighboring chunks share a common map, they will be smoothed together.

The only problem is that if you change a chunk's seed while the player is away in order to make it different, you also have to reload the neighboring chunks, because their borders should change as well.

This does not change the chunk size, but it does increase the minimum distance from the player at which chunks are loaded/unloaded, because a chunk must be loaded before the player can see it, and with this method its neighbors must be loaded too.
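A rough sketch of that neighbour-aware generation, assuming a 2D noise function noise2(seed, x, y) sampled in world coordinates and a made-up chunk record holding a seed and a grid position; near a shared edge, each chunk blends its own noise with its neighbour's so both compute the same values at the border:

// Sketch: relief near a chunk's right edge is a mix of its own noise and
// the right neighbour's noise, so the two chunks meet exactly at the seam.
function borderBlendedHeight(x, y, chunk, rightNeighbour, chunkSize, blendWidth) {
  var own = noise2(chunk.seed, x, y);               // this chunk's relief
  var distToEdge = (chunk.gridX + 1) * chunkSize - x;
  if (distToEdge > blendWidth) return own;          // deep inside: own noise only
  var other = noise2(rightNeighbour.seed, x, y);    // neighbour's relief, same world coords
  var t = 0.5 * (1 - distToEdge / blendWidth);      // reaches 0.5 exactly on the border
  return (1 - t) * own + t * other;
}

A real implementation would repeat this for all four edges and, as the update below notes, deal with the corners where four chunks meet.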

Update:

The problem gets worse if every part of your world is of a different type. Then it is not just about relief anymore. An expensive solution would be the following:

[image: world divided into chunks of different types]

Let's assume the green chunks are forest worlds, the blue ones archipelago worlds, and the yellow ones flat deserts.
The solution here is to create "transition" areas, where the relief and the nature of the ground (and the objects on it, or anything else you want) gradually change from one type to the other.

As you can see in this picture, the hellish part to code will be the little squares at the chunk corners: they have to make the link between 4 chunks, which may all be of different natures.

So for this level of complexity, I do not think a classic 2D world generation such as Perlin2D can be used. For that, I refer you to @danijar's answer.
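To illustrate the "transition" areas, a rough sketch of blending two chunk-type height functions across a border strip; biomeA, biomeB and the strip width are stand-ins for whatever per-type generators you have:

// Sketch: blend two biome height functions across a transition strip of
// width stripWidth centred on the border at x = borderX.
function transitionHeight(x, y, borderX, stripWidth, biomeA, biomeB) {
  var t = (x - (borderX - stripWidth / 2)) / stripWidth; // 0..1 across the strip
  t = Math.min(1, Math.max(0, t));                       // clamp outside the strip
  t = t * t * (3 - 2 * t);                               // smoothstep, no hard crease
  return (1 - t) * biomeA(x, y) + t * biomeB(x, y);
}

The corner squares called out above would blend four such functions bilinearly instead of two.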


Are you suggesting generating the "center" of a chunk from its seed and "smoothing" its edges based on the neighboring chunks? That makes sense, but it would increase the chunk size, since a chunk would have to cover the area a player can observe plus twice the width of the transition area to the neighboring chunks. The more diverse the world, the larger the chunk area.
netaholic

@netaholic Not that much bigger, but somewhat, yes. I added a paragraph about it above.
Aracthor

I have updated my question and tried to describe some of my ideas.
netaholic

So the other answer here uses (sort of, not exactly) the third dimension as a chart. You also treat the plane as a whole, and I like your idea. To extend it further you really want a smooth manifold; you need to make sure the transitions are smooth. Then you could apply blur or noise on top of that, and the answer would be perfect.
Alec Teal

0

While danijar's idea is solid, you could end up storing a lot of data if you want local areas to stay the same while offsetting by distance, and it requires more and more complex noise for more and more slices. You can get all of this in a more standard 2D fashion.

I developed an algorithm for procedurally generating random fractal noise, based in part on the diamond-square algorithm, which I fixed up to be infinite and deterministic. So endless landscapes can be created with diamond-square as well as with my own, rather blocky, algorithm.

The idea is basically the same, except that instead of sampling higher-dimensional noise, you vary the values fed into the different iteration levels.

So you can still store previously requested values and cache them (this scheme is independently useful for speeding up an already very fast algorithm). When a new area is requested, it is created with the new values, and any area not part of that request is dropped.

So rather than traveling through a different space along an extra dimension, we store extra data that only ever increases, blended in gradually at the different levels.

If the user travels in some direction, the values shift accordingly (at every level), and new values are generated at the new edge. If the topmost iteration's seed is changed, the whole world changes drastically. If the final iteration is given a different result, the amount of change is tiny, plus or minus a block or so. The mountains are still there, the valleys are still there, but the nooks and crannies change. Unless you travel far enough away, in which case the mountain will be gone.

So if we store a 100x100 block of values for each iteration, the player will see no change at 100x100 resolution. At 200x200 resolution things can shift by 1 block. At 400x400 resolution things can shift by 2 blocks. At 800x800 resolution things can shift by 4 blocks. So things will change, and they will change more and more as you travel. If you go back they will be different, and if you travel far enough they will be changed completely and lost entirely, because all of the seeds will have been discarded.

Adding an extra dimension to get this stabilizing effect certainly works, shifting y away by some distance, but you end up storing a lot of data for a lot of chunks when you do not have to. With a deterministic fractal noise algorithm you can get the same effect by adding changed values (by differing amounts at each level) once the position moves beyond a certain point.

https://jsfiddle.net/rkdzau7o/

var SCALE_FACTOR = 2;
//The scale factor is kind of arbitrary, but the code is only consistent for 2 currently. Gives noise for other scale but not location proper.
var BLUR_EDGE = 2; //extra pixels are needed for the blur (3 - 1).
var buildbuffer = BLUR_EDGE + SCALE_FACTOR;

canvas = document.getElementById('canvas');
ctx = canvas.getContext("2d");
var stride = canvas.width + buildbuffer;
var colorvalues = new Array(stride * (canvas.height + buildbuffer));
var iterations = 7;
var xpos = 0;
var ypos = 0;
var singlecolor = true;


/**
 * Function adds all the required ints into the ints array.
 * Note that the scanline should not actually equal the width.
 * It should be larger as per the getRequiredDim function.
 *
 * @param iterations Number of iterations to perform.
 * @param ints       pixel array to be used to insert values. (Pass by reference)
 * @param stride     distance in the array to the next requestedY value.
 * @param x          requested X location.
 * @param y          requested Y location.
 * @param width      width of the image.
 * @param height     height of the image.
 */

function fieldOlsenNoise(iterations, ints, stride, x, y, width, height) {
  olsennoise(ints, stride, x, y, width, height, iterations); //Calls the main routine.
  //applyMask(ints, stride, width, height, 0xFF000000);
}

function applyMask(pixels, stride, width, height, mask) {
  var index;
  index = 0;
  for (var k = 0, n = height - 1; k <= n; k++, index += stride) {
    for (var j = 0, m = width - 1; j <= m; j++) {
      pixels[index + j] |= mask;
    }
  }
}

/**
 * Converts a dimension into the dimension required by the algorithm.
 * Due to the blurring, to get valid data the array must be slightly larger.
 * Due to the interpixel location at lowest levels it needs to be bigger by
 * the max value that can be. (SCALE_FACTOR)
 *
 * @param dim
 * @return
 */

function getRequiredDim(dim) {
  return dim + BLUR_EDGE + SCALE_FACTOR;
}

//Function inserts the values into the given ints array (pass by reference)
//The results will be within 0-255 assuming the requested iterations are 7.
function olsennoise(ints, stride, x_within_field, y_within_field, width, height, iteration) {
  if (iteration == 0) {
    //Base case. If we are at the bottom. Do not run the rest of the function. Return random values.
    clearValues(ints, stride, width, height); //base case needs zero, apply Noise will not eat garbage.
    applyNoise(ints, stride, x_within_field, y_within_field, width, height, iteration);
    return;
  }

  var x_remainder = x_within_field & 1; //Adjust the x_remainder so we know how much more into the pixel are.
  var y_remainder = y_within_field & 1; //Math.abs(y_within_field % SCALE_FACTOR) - Would be assumed for larger scalefactors.

  /*
  Pass the ints, and the stride for that set of ints.
  Recurse the call to the function moving the x_within_field forward if we actually want half a pixel at the start.
  Same for the requestedY.
  The width should expanded by the x_remainder, and then half the size, with enough extra to store the extra ints from the blur.
  If the width is too long, it'll just run more stuff than it needs to.
  */

  olsennoise(ints, stride,
    (Math.floor((x_within_field + x_remainder) / SCALE_FACTOR)) - x_remainder,
    (Math.floor((y_within_field + y_remainder) / SCALE_FACTOR)) - y_remainder,
    (Math.floor((width + x_remainder) / SCALE_FACTOR)) + BLUR_EDGE,
    (Math.floor((height + y_remainder) / SCALE_FACTOR)) + BLUR_EDGE, iteration - 1);

  //This will scale the image from half the width and half the height. bounds.
  //The scale function assumes you have at least width/2 and height/2 good ints.
  //We requested those from olsennoise above, so we should have that.

  applyScaleShift(ints, stride, width + BLUR_EDGE, height + BLUR_EDGE, SCALE_FACTOR, x_remainder, y_remainder);

  //This applies the blur and uses the given bounds.
  //Since the blur loses two at the edge, this will result
  //in us having width requestedX height of good ints and required
  // width + blurEdge of good ints. height + blurEdge of good ints.
  applyBlur(ints, stride, width + BLUR_EDGE, height + BLUR_EDGE);

  //Applies noise to all the given ints. Does not require more or less than ints. Just offsets them all randomly.
  applyNoise(ints, stride, x_within_field, y_within_field, width, height, iteration);
}



function applyNoise(pixels, stride, x_within_field, y_within_field, width, height, iteration) {
  var bitmask = 0b00000001000000010000000100000001 << (7 - iteration);
  var index = 0;
  for (var k = 0, n = height - 1; k <= n; k++, index += stride) { //iterate the requestedY positions. Offsetting the index by stride each time.
    for (var j = 0, m = width - 1; j <= m; j++) { //iterate the requestedX positions through width.
      var current = index + j; // The current position of the pixel is the index which will have added stride each, requestedY iteration
      pixels[current] += hashrandom(j + x_within_field, k + y_within_field, iteration) & bitmask;
      //add on to this pixel the hash function with the set reduction.
      //It simply must scale down with the larger number of iterations.
    }
  }
}

function applyScaleShift(pixels, stride, width, height, factor, shiftX, shiftY) {
  var index = (height - 1) * stride; //We must iterate backwards to scale, so index starts at the last Y position.
  for (var k = 0, n = height - 1; k <= n; n--, index -= stride) { // we iterate the requestedY, removing stride from index.
    for (var j = 0, m = width - 1; j <= m; m--) { // iterate the requestedX positions from width to 0.
      var pos = index + m; //current position is the index (position of that scanline of Y) plus our current iteration in scale.
      var lower = (Math.floor((n + shiftY) / factor) * stride) + Math.floor((m + shiftX) / factor); //We find the position that is half that size. From where we scale them out.
      pixels[pos] = pixels[lower]; // Set the outer position to the inner position. Applying the scale.
    }
  }
}

function clearValues(pixels, stride, width, height) {
  var index;
  index = 0;
  for (var k = 0, n = height - 1; k <= n; k++, index += stride) { //iterate the requestedY values.
    for (var j = 0, m = width - 1; j <= m; j++) { //iterate the requestedX values.
      pixels[index + j] = 0; //clears those values.
    }
  }
}

//Applies the blur.
//loopunrolled box blur 3x3 in each color.
function applyBlur(pixels, stride, width, height) {
  var index = 0;
  var v0;
  var v1;
  var v2;

  var r;
  var g;
  var b;

  for (var j = 0; j < height; j++, index += stride) {
    for (var k = 0; k < width; k++) {
      var pos = index + k;

      v0 = pixels[pos];
      v1 = pixels[pos + 1];
      v2 = pixels[pos + 2];

      r = ((v0 >> 16) & 0xFF) + ((v1 >> 16) & 0xFF) + ((v2 >> 16) & 0xFF);
      g = ((v0 >> 8) & 0xFF) + ((v1 >> 8) & 0xFF) + ((v2 >> 8) & 0xFF);
      b = ((v0) & 0xFF) + ((v1) & 0xFF) + ((v2) & 0xFF);
      r = Math.floor(r / 3);
      g = Math.floor(g / 3);
      b = Math.floor(b / 3);
      pixels[pos] = r << 16 | g << 8 | b;
    }
  }
  index = 0;
  for (var j = 0; j < height; j++, index += stride) {
    for (var k = 0; k < width; k++) {
      var pos = index + k;
      v0 = pixels[pos];
      v1 = pixels[pos + stride];
      v2 = pixels[pos + (stride << 1)];

      r = ((v0 >> 16) & 0xFF) + ((v1 >> 16) & 0xFF) + ((v2 >> 16) & 0xFF);
      g = ((v0 >> 8) & 0xFF) + ((v1 >> 8) & 0xFF) + ((v2 >> 8) & 0xFF);
      b = ((v0) & 0xFF) + ((v1) & 0xFF) + ((v2) & 0xFF);
      r = Math.floor(r / 3);
      g = Math.floor(g / 3);
      b = Math.floor(b / 3);
      pixels[pos] = r << 16 | g << 8 | b;
    }
  }
}


function hashrandom(v0, v1, v2) {
  var hash = 0;
  hash ^= v0;
  hash = hashsingle(hash);
  hash ^= v1;
  hash = hashsingle(hash);
  hash ^= v2;
  hash = hashsingle(hash);
  return hash;
}

function hashsingle(v) {
  var hash = v;
  var h = hash;

  switch (hash & 3) {
    case 3:
      hash += h;
      hash ^= hash << 32;
      hash ^= h << 36;
      hash += hash >> 22;
      break;
    case 2:
      hash += h;
      hash ^= hash << 22;
      hash += hash >> 34;
      break;
    case 1:
      hash += h;
      hash ^= hash << 20;
      hash += hash >> 2;
  }
  hash ^= hash << 6;
  hash += hash >> 10;
  hash ^= hash << 8;
  hash += hash >> 34;
  hash ^= hash << 50;
  hash += hash >> 12;
  return hash;
}


//END, OLSEN NOISE.



//Nuts and bolts code.

function MoveMap(dx, dy) {
  xpos -= dx;
  ypos -= dy;
  drawMap();
}

function drawMap() {
  //int iterations, int[] ints, int stride, int x, int y, int width, int height
  console.log("Here.");
  fieldOlsenNoise(iterations, colorvalues, stride, xpos, ypos, canvas.width, canvas.height);
  var img = ctx.createImageData(canvas.width, canvas.height);

  for (var y = 0, h = canvas.height; y < h; y++) {
    for (var x = 0, w = canvas.width; x < w; x++) {
      var standardShade = colorvalues[(y * stride) + x];
      var pData = ((y * w) + x) * 4;
      if (singlecolor) {
        img.data[pData] = standardShade & 0xFF;
        img.data[pData + 1] = standardShade & 0xFF;
        img.data[pData + 2] = standardShade & 0xFF;
      } else {
        img.data[pData] = standardShade & 0xFF;
        img.data[pData + 1] = (standardShade >> 8) & 0xFF;
        img.data[pData + 2] = (standardShade >> 16) & 0xFF;
      }
      img.data[pData + 3] = 255;
    }
  }
  ctx.putImageData(img, 0, 0);
}

$("#update").click(function(e) {
  iterations = parseInt($("#iterations").val(), 10) || iterations;
  drawMap();
})
$("#colors").click(function(e) {
  singlecolor = !singlecolor;
  drawMap();
})

var m = this;
m.map = document.getElementById("canvas");
m.width = canvas.width;
m.height = canvas.height;

m.hoverCursor = "auto";
m.dragCursor = "url(data:image/vnd.microsoft.icon;base64,AAACAAEAICACAAcABQAwAQAAFgAAACgAAAAgAAAAQAAAAAEAAQAAAAAAAAEAAAAAAAAAAAAAAgAAAAAAAAAAAAAA////AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAD8AAAA/AAAAfwAAAP+AAAH/gAAB/8AAAH/AAAB/wAAA/0AAANsAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA//////////////////////////////////////////////////////////////////////////////////////gH///4B///8Af//+AD///AA///wAH//+AB///wAf//4AH//+AD///yT/////////////////////////////8=), default";
m.scrollTime = 300;

m.mousePosition = new Coordinate;
m.mouseLocations = [];
m.velocity = new Coordinate;
m.mouseDown = false;
m.timerId = -1;
m.timerCount = 0;

m.viewingBox = document.createElement("div");
m.viewingBox.style.cursor = m.hoverCursor;

m.map.parentNode.replaceChild(m.viewingBox, m.map);
m.viewingBox.appendChild(m.map);
m.viewingBox.style.overflow = "hidden";
m.viewingBox.style.width = m.width + "px";
m.viewingBox.style.height = m.height + "px";
m.viewingBox.style.position = "relative";
m.map.style.position = "absolute";

function AddListener(element, event, f) {
  if (element.attachEvent) {
    element["e" + event + f] = f;
    element[event + f] = function() {
      element["e" + event + f](window.event);
    };
    element.attachEvent("on" + event, element[event + f]);
  } else
    element.addEventListener(event, f, false);
}

function Coordinate(startX, startY) {
  this.x = startX;
  this.y = startY;
}

var MouseMove = function(b) {
  var e = b.clientX - m.mousePosition.x;
  var d = b.clientY - m.mousePosition.y;
  MoveMap(e, d);
  m.mousePosition.x = b.clientX;
  m.mousePosition.y = b.clientY;
};

/**
 * mousedown event handler
 */
AddListener(m.viewingBox, "mousedown", function(e) {
  m.viewingBox.style.cursor = m.dragCursor;

  // Save the current mouse position so we can later find how far the
  // mouse has moved in order to scroll that distance
  m.mousePosition.x = e.clientX;
  m.mousePosition.y = e.clientY;

  // Start paying attention to when the mouse moves
  AddListener(document, "mousemove", MouseMove);
  m.mouseDown = true;

  e.preventDefault ? e.preventDefault() : e.returnValue = false;
});

/**
 * mouseup event handler
 */
AddListener(document, "mouseup", function() {
  if (m.mouseDown) {
    var handler = MouseMove;
    if (document.detachEvent) {
      document.detachEvent("onmousemove", document["mousemove" + handler]);
      document["mousemove" + handler] = null;
    } else {
      document.removeEventListener("mousemove", handler, false);
    }

    m.mouseDown = false;

    if (m.mouseLocations.length > 0) {
      var clickCount = m.mouseLocations.length;
      m.velocity.x = (m.mouseLocations[clickCount - 1].x - m.mouseLocations[0].x) / clickCount;
      m.velocity.y = (m.mouseLocations[clickCount - 1].y - m.mouseLocations[0].y) / clickCount;
      m.mouseLocations.length = 0;
    }
  }

  m.viewingBox.style.cursor = m.hoverCursor;
});

drawMap();
<script src="https://ajax.googleapis.com/ajax/libs/jquery/2.1.1/jquery.min.js"></script>
<canvas id="canvas" width="500" height="500">
</canvas>
<fieldset>
  <legend>Height Map Properties</legend>
  <input type="text" name="iterations" id="iterations">
  <label for="iterations">
    Iterations(7)
  </label>
  <label>
    <input type="checkbox" id="colors" />Rainbow</label>
</fieldset>
